Technology and Diversity – EDUC 629

CG • Section 8WK • 11/08/2019 to 04/16/2020 • Modified 09/05/2023


Course Description

This course focuses on the nature of individual learners as it impacts instructional design with the use of technology in education. Specifically, the course covers tools, methods, and approaches for meeting the needs of children with special learning needs, those of differing socioeconomic and cultural backgrounds, and those with varying learning styles and preferences. The course aims to bridge the gap between the theoretical implications and practical applications of diversity considerations in the context of educational technology integration.

For information regarding prerequisites for this course, please refer to the Academic Course Catalog.

Leaders focusing on educational technology design and management must be aware of the changing demographic landscape of education today. Students from rich, diverse backgrounds—in culture, socioeconomic status, learning ability, and learning preference—reveal the need for efforts in developing effective means for planning individual learning experiences. Issues of awareness, design, and effective application and implementation become important in this light. This course focuses on these very issues.

Course Assignment

Textbook readings and lecture presentations

Course Requirements Checklist

After reading the Course Syllabus and Student Expectations, the candidate will complete the related checklist found in the Course Overview.

Discussions (2)

Discussions are collaborative learning experiences. Therefore, the candidate is required to create a thread in response to the provided prompt for each Discussion. Each thread must be at least 200 words and contain at least 2 references cited in current APA format. In addition to the thread, the candidate is required to reply to at least 2 classmates’ threads. Each reply must be at least 100 words and contain at least 1 reference in current APA format. (CLO: A, B, D, F)

Technology Standards Infographic Assignment

For this assignment, the candidate will create an infographic covering the International Society for Technology in Education (ISTE) standards, the Universal Design for Learning (UDL) guidelines, and the Technological Pedagogical Content Knowledge (TPACK) framework. The candidate must then provide an overview of each standard’s unique qualities, as well as an explanation of how each can be applied in the field of instructional design and technology. The goal of this assignment is to demonstrate an understanding of each of these standards and provide a foundation for how they will be applied through the candidate’s Technology Implementation Plan. (CLO: C, D, E)

Learning Community Demographics Paper Assignment

The candidate will write a 2-3-page paper in current APA format that focuses on the demographics of their local community.  The paper must include a minimum of 3 references. (CLO: B, C, E)

Technology Application Review and Demonstration Assignment

The candidate will create a 5-10 slide presentation critiquing three current applications/programs available for classroom instruction. The candidate will then select the one of the three reviewed programs that best aligns with their program goals and develop a 3-5 minute screencast demonstrating how to use it in the classroom. (CLO: A, D, E)

Technology Application Professional Development and Lesson Plans Assignment

The candidate will develop a professional development plan describing how teachers will be prepared to use the new technology in their classroom instruction.  Then, the candidate will create two formal lesson plans demonstrating how the students will use the technology during lessons. (CLO: A, B, C, D, E)

Technology Tool Assessment Assignment

The candidate will write a one-page paper in current APA format that focuses on the goals and objectives of their technology plan.  Then, the candidate will create two surveys, one for teachers and one for students, to aid in the evaluation of their technology plan. (CLO: A, B, C, D, E, F)

Assistive Technology Application for All Learners Assignment

The candidate will write a 3-5 page paper in current APA format that focuses on the assistive technologies available within the program they have chosen to include in their curriculum. The paper must include a minimum of two references. (CLO: B, D, E)


Review article • Open access • Published: 02 October 2017

Computer-based technology and student engagement: a critical review of the literature

Laura A. Schindler (ORCID: 0000-0001-8730-5189), Gary J. Burkholder, Osama A. Morad & Craig Marsh

International Journal of Educational Technology in Higher Education, volume 14, Article number: 25 (2017)


Abstract

Computer-based technology has infiltrated many aspects of life and industry, yet there is little understanding of how it can be used to promote student engagement, a concept receiving strong attention in higher education due to its association with a number of positive academic outcomes. The purpose of this article is to present a critical review of the literature from the past 5 years related to how web-conferencing software, blogs, wikis, social networking sites (Facebook and Twitter), and digital games influence student engagement. We prefaced the findings with a substantive overview of student engagement definitions and indicators, which revealed three types of engagement (behavioral, emotional, and cognitive) that informed how we classified articles. Our findings suggest that digital games provide the most far-reaching influence across different types of student engagement, followed by web-conferencing and Facebook. Findings regarding wikis, blogs, and Twitter are less conclusive and significantly limited in the number of studies conducted within the past 5 years. Overall, the findings provide preliminary support that computer-based technology influences student engagement; however, additional research is needed to confirm and build on these findings. We conclude the article by providing a list of recommendations for practice, with the intent of increasing understanding of how computer-based technology may be purposefully implemented to achieve the greatest gains in student engagement.

Introduction

The digital revolution has profoundly affected daily living, evident in the ubiquity of mobile devices and the seamless integration of technology into common tasks such as shopping, reading, and finding directions (Anderson, 2016; Smith & Anderson, 2016; Zickuhr & Raine, 2014). The use of computers, mobile devices, and the Internet is at its highest level to date and is expected to continue to increase as technology becomes more accessible, particularly for users in developing countries (Poushter, 2016). In addition, a growing number of people are smartphone dependent, relying solely on smartphones for Internet access (Anderson & Horrigan, 2016) rather than on more expensive devices such as laptops and tablets. Greater access to and demand for technology has presented unique opportunities and challenges for many industries, some of which have thrived by effectively digitizing their operations and services (e.g., finance, media) and others that have struggled to keep up with the pace of technological innovation (e.g., education, healthcare) (Gandhi, Khanna, & Ramaswamy, 2016).

Integrating technology into teaching and learning is not a new challenge for universities. Since the 1900s, administrators and faculty have grappled with how to effectively use technical innovations such as video and audio recordings, email, and teleconferencing to augment or replace traditional instructional delivery methods (Kaware & Sain, 2015; Westera, 2015). Within the past two decades, however, this challenge has become much more difficult due to the sheer volume of new technologies on the market. For example, in the span of 7 years (from 2008 to 2015), the number of active apps in Apple’s App Store increased from 5000 to 1.75 million. Over the next 4 years, the number of apps is projected to rise by 73%, totaling over 5 million (Nelson, 2016). Further compounding this challenge is the limited shelf life of new devices and software, combined with significant internal organizational barriers that hinder universities from efficiently and effectively integrating new technologies (Amirault, 2012; Kinchin, 2012; Linder-VanBerschot & Summers, 2015; Westera, 2015).

Many organizational barriers to technology integration arise from competing tensions between institutional policy and practice and faculty beliefs and abilities. For example, university administrators may view technology as a tool to attract and retain students, whereas faculty may struggle to determine how technology coincides with existing pedagogy (Lawrence & Lentle-Keenan, 2013; Lin, Singer, & Ha, 2010). In addition, some faculty may be hesitant to use technology due to a lack of technical knowledge and/or skepticism about the efficacy of technology to improve student learning outcomes (Ashrafzadeh & Sayadian, 2015; Buchanan, Sainter, & Saunders, 2013; Hauptman, 2015; Johnson, 2013; Kidd, Davis, & Larke, 2016; Kopcha, Rieber, & Walker, 2016; Lawrence & Lentle-Keenan, 2013; Lewis, Fretwell, Ryan, & Parham, 2013; Reid, 2014). Organizational barriers to technology adoption are particularly problematic given students’ growing demands for, and perceived benefits of, using technology to learn (Amirault, 2012; Cassidy et al., 2014; Gikas & Grant, 2013; Paul & Cochran, 2013). Surveys suggest that two-thirds of students use mobile devices for learning and believe that technology can help them achieve learning outcomes and better prepare them for a workforce that is increasingly dependent on technology (Chen, Seilhamer, Bennett, & Bauer, 2015; Dahlstrom, 2012). Universities that fail to effectively integrate technology into the learning experience miss opportunities to improve student outcomes and meet the expectations of a student body that has grown accustomed to the integration of technology into every facet of life (Amirault, 2012; Cook & Sonnenberg, 2014; Revere & Kovach, 2011; Sun & Chen, 2016; Westera, 2015).

The purpose of this paper is to provide a literature review on how computer-based technology influences student engagement within higher education settings. We focused on computer-based technology given the specific types of technologies (i.e., web-conferencing software, blogs, wikis, social networking sites, and digital games) that emerged from a broad search of the literature, which is described in more detail below. Computer-based technology (hereafter referred to as technology) requires the use of specific hardware, software, and microprocessing features available on a computer or mobile device. We also focused on student engagement as the dependent variable of interest because it encompasses many different aspects of the teaching and learning process (Bryson & Hand, 2007; Fredricks, Blumenfeld, & Paris, 2004; Wimpenny & Savin-Baden, 2013), compared to narrower variables in the literature such as final grades or exam scores. Furthermore, student engagement has received significant attention over the past several decades due to shifts towards student-centered, constructivist instructional methods (Haggis, 2009; Wright, 2011), mounting pressures to improve teaching and learning outcomes (Axelson & Flick, 2011; Kuh, 2009), and promising studies suggesting relationships between student engagement and positive academic outcomes (Carini, Kuh, & Klein, 2006; Center for Postsecondary Research, 2016; Hu & McCormick, 2012). Despite the interest in student engagement and the demand for more technology in higher education, there are no articles offering a comprehensive review of how these two variables intersect. Similarly, while many existing student engagement conceptual models have expanded to include factors that influence student engagement, none highlight the overt role of technology in the engagement process (Kahu, 2013; Lam, Wong, Yang, & Yi, 2012; Nora, Barlow, & Crisp, 2005; Wimpenny & Savin-Baden, 2013; Zepke & Leach, 2010).

Our review aims to address existing gaps in the student engagement literature and seeks to determine whether student engagement models should be expanded to include technology. The review also addresses some of the organizational barriers to technology integration (e.g., faculty uncertainty and skepticism about technology) by providing a comprehensive account of the research evidence regarding how technology influences student engagement. One limitation of the literature, however, is the lack of detail regarding how teaching and learning practices were used to select and integrate technology into learning. For example, the methodology section of many studies does not include a pedagogical justification for why a particular technology was used or details about the design of the learning activity itself. Therefore, it often is unclear how teaching and learning practices may have affected student engagement levels. We revisit this issue in more detail at the end of this paper in our discussions of areas for future research and recommendations for practice. We initiated our literature review by conducting a broad search for articles published within the past 5 years, using the keywords technology and higher education, in Google Scholar and the following research databases: Academic Search Complete, Communication & Mass Media Complete, Computers & Applied Sciences Complete, Education Research Complete, ERIC, PsycARTICLES, and PsycINFO. Our initial search revealed themes regarding which technologies were most prevalent in the literature (e.g., social networking, digital games), which then led to several more targeted searches of the same databases using specific keywords such as Facebook and student engagement. After both broad and targeted searches, we identified five technologies (web-conferencing software, blogs, wikis, social networking sites, and digital games) to include in our review.

We chose to focus on technologies for which multiple studies had been published, allowing us to identify areas of convergence and divergence in the literature and draw conclusions about positive and negative effects on student engagement. In total, we identified 69 articles relevant to our review: 36 pertaining to social networking sites (21 for Facebook and 15 for Twitter), 14 pertaining to digital games, seven pertaining to wikis, and six each pertaining to blogs and web-conferencing software. Articles were categorized according to their influence on specific types of student engagement, which will be described in more detail below. In some instances, one article pertained to multiple types of engagement. In the sections that follow, we will provide an overview of student engagement, including an explanation of common definitions and indicators of engagement, followed by a synthesis of how each type of technology influences student engagement. Finally, we will discuss areas for future research and make recommendations for practice.
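As a quick sanity check, the article counts above partition the 69 reviewed studies by technology. A minimal tally, using only the numbers stated in the text (the dictionary below is an illustrative structure, not something from the article itself):

```python
# Article counts by technology, as reported in the review above.
counts = {
    "Facebook": 21,          # social networking sites: 21 + 15 = 36
    "Twitter": 15,
    "digital games": 14,
    "wikis": 7,
    "blogs": 6,
    "web-conferencing": 6,
}

social_networking_total = counts["Facebook"] + counts["Twitter"]
total_articles = sum(counts.values())

print(social_networking_total)  # 36
print(total_articles)           # 69
```

Note that the per-technology counts sum exactly to the stated total of 69, which is why "six pertaining to blogs and web-conferencing software respectively" must mean six studies each.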

Student engagement

Interest in student engagement began over 70 years ago with Ralph Tyler’s research on the relationship between time spent on coursework and learning (Axelson & Flick, 2011; Kuh, 2009). Since then, the study of student engagement has evolved and expanded considerably, through the seminal works of Pace (1980; 1984) and Astin (1984) about how quantity and quality of student effort affect learning and many more recent studies on the environmental conditions and individual dispositions that contribute to student engagement (Bakker, Vergel, & Kuntze, 2015; Gilboy, Heinerichs, & Pazzaglia, 2015; Martin, Goldwasser, & Galentino, 2017; Pellas, 2014). Perhaps the most well-known resource on student engagement is the National Survey of Student Engagement (NSSE), an instrument designed to assess student participation in various educational activities (Kuh, 2009). The NSSE and other engagement instruments like it have been used in many studies that link student engagement to positive student outcomes such as higher grades, retention, persistence, and completion (Leach, 2016; McClenney, Marti, & Adkins, 2012; Trowler & Trowler, 2010), further convincing universities that student engagement is an important factor in the teaching and learning process. However, despite the increased interest in student engagement, its meaning is generally not well understood or agreed upon.

Student engagement is a broad and complex phenomenon for which there are many definitions grounded in psychological, social, and/or cultural perspectives (Fredricks et al., 2004; Wimpenny & Savin-Baden, 2013; Zepke & Leach, 2010). A review of definitions revealed that student engagement is defined in two ways. One set of definitions refers to student engagement as a desired outcome reflective of a student’s thoughts, feelings, and behaviors about learning. For example, Kahu (2013) defines student engagement as an “individual psychological state” that includes a student’s affect, cognition, and behavior (p. 764). Other definitions focus primarily on student behavior, suggesting that engagement is the “extent to which students are engaging in activities that higher education research has shown to be linked with high-quality learning outcomes” (Krause & Coates, 2008, p. 493) or the “quality of effort and involvement in productive learning activities” (Kuh, 2009, p. 6). Another set of definitions refers to student engagement as a process involving both the student and the university. For example, Trowler (2010) defined student engagement as “the interaction between the time, effort and other relevant resources invested by both students and their institutions intended to optimize the student experience and enhance the learning outcomes and development of students and the performance, and reputation of the institution” (p. 2). Similarly, the NSSE website indicates that student engagement is “the amount of time and effort students put into their studies and other educationally purposeful activities” as well as “how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning” (Center for Postsecondary Research, 2017, para. 1).

Many existing models of student engagement reflect the latter set of definitions, depicting engagement as a complex, psychosocial process involving both student and university characteristics. Such models organize the engagement process into three areas: factors that influence student engagement (e.g., institutional culture, curriculum, and teaching practices), indicators of student engagement (e.g., interest in learning, interaction with instructors and peers, and meaningful processing of information), and outcomes of student engagement (e.g., academic achievement, retention, and personal growth) (Kahu, 2013; Lam et al., 2012; Nora et al., 2005). In this review, we examine the literature to determine whether technology influences student engagement. In addition, we will use Fredricks et al.’s (2004) typology of student engagement to organize and present research findings, which suggests that there are three types of engagement (behavioral, emotional, and cognitive). The typology is useful because it is broad in scope, encompassing different types of engagement that capture a range of student experiences, rather than narrower typologies that offer specific or prescriptive conceptualizations of student engagement. In addition, this typology is student-centered, focusing exclusively on student-focused indicators rather than combining student indicators with confounding variables, such as faculty behavior, curriculum design, and campus environment (Coates, 2008; Kuh, 2009). While such variables are important in the discussion of student engagement, perhaps as factors that may influence engagement, they are not true indicators of student engagement. Using the typology as a guide, we examined recent student engagement research, models, and measures to gain a better understanding of how behavioral, emotional, and cognitive student engagement are conceptualized and to identify specific indicators that correspond with each type of engagement, as shown in Fig. 1.

Fig. 1 Conceptual framework of types and indicators of student engagement

Behavioral engagement is the degree to which students are actively involved in learning activities (Fredricks et al., 2004; Kahu, 2013; Zepke, 2014). Indicators of behavioral engagement include time and effort spent participating in learning activities (Coates, 2008; Fredricks et al., 2004; Kahu, 2013; Kuh, 2009; Lam et al., 2012; Lester, 2013; Trowler, 2010) and interaction with peers, faculty, and staff (Coates, 2008; Kahu, 2013; Kuh, 2009; Bryson & Hand, 2007; Wimpenny & Savin-Baden, 2013; Zepke & Leach, 2010). Indicators of behavioral engagement reflect observable student actions and most closely align with Pace’s (1980) and Astin’s (1984) original conceptualizations of student engagement as quantity and quality of effort towards learning. Emotional engagement is students’ affective reactions to learning (Fredricks et al., 2004; Lester, 2013; Trowler, 2010). Indicators of emotional engagement include attitudes, interests, and values towards learning (Fredricks et al., 2004; Kahu, 2013; Lester, 2013; Trowler, 2010; Wimpenny & Savin-Baden, 2013; Witkowski & Cornell, 2015) and a perceived sense of belonging within a learning community (Fredricks et al., 2004; Kahu, 2013; Lester, 2013; Trowler, 2010; Wimpenny & Savin-Baden, 2013). Emotional engagement often is assessed using self-report measures (Fredricks et al., 2004) and provides insight into how students feel about a particular topic, delivery method, or instructor. Finally, cognitive engagement is the degree to which students invest in learning and expend mental effort to comprehend and master content (Fredricks et al., 2004; Lester, 2013). Indicators of cognitive engagement include: motivation to learn (Lester, 2013; Richardson & Newby, 2006; Zepke & Leach, 2010); persistence to overcome academic challenges and meet/exceed requirements (Fredricks et al., 2004; Kuh, 2009; Trowler, 2010); and deep processing of information (Fredricks et al., 2004; Kahu, 2013; Lam et al., 2012; Richardson & Newby, 2006) through critical thinking (Coates, 2008; Witkowski & Cornell, 2015), self-regulation (e.g., setting goals, planning, organizing study effort, and monitoring learning; Fredricks et al., 2004; Lester, 2013), and the active construction of knowledge (Coates, 2008; Kuh, 2009). While cognitive engagement includes motivational aspects, much of the literature focuses on how students use active learning and higher-order thinking, in some form, to achieve content mastery. For example, there is significant emphasis on the importance of deep learning, which involves analyzing new learning in relation to previous knowledge, compared to surface learning, which is limited to memorization, recall, and rehearsal (Fredricks et al., 2004; Kahu, 2013; Lam et al., 2012).
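Since the Fig. 1 image does not reproduce in this text version, the typology and the indicators enumerated above can be sketched as a simple mapping. This is only an illustrative data structure summarizing the paragraph; the names and grouping come from the article, the code itself does not:

```python
# Fredricks et al.'s (2004) three-part typology of student engagement,
# with the indicator groups identified in this review (cf. Fig. 1).
engagement_indicators = {
    "behavioral": [
        "time and effort spent participating in learning activities",
        "interaction with peers, faculty, and staff",
    ],
    "emotional": [
        "attitudes, interests, and values towards learning",
        "perceived sense of belonging within a learning community",
    ],
    "cognitive": [
        "motivation to learn",
        "persistence to overcome academic challenges",
        "deep processing of information (critical thinking, "
        "self-regulation, active construction of knowledge)",
    ],
}

# Articles in the review were categorized by which of these types
# (possibly several per article) a study's dependent variables mapped onto.
for engagement_type, indicators in engagement_indicators.items():
    print(engagement_type, "->", len(indicators), "indicator groups")
```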

While each type of engagement has distinct features, there is some overlap across the cognitive, behavioral, and emotional domains. In instances where an indicator could correspond with more than one type of engagement, we chose to match the indicator to the type of engagement with which it most closely aligned, based on our review of the engagement literature and our interpretation of the indicators. Similarly, there is also some overlap among indicators. As a result, we combined and subsumed similar indicators found in the literature, where appropriate, to avoid redundancy. Achieving an in-depth understanding of student engagement and associated indicators was an important precursor to our review of the technology literature. Very few articles used the term student engagement as a dependent variable, given that the concept is so broad and multidimensional. We found that specific indicators (e.g., interaction, sense of belonging, and knowledge construction) of student engagement were more common in the literature as dependent variables. Next, we will provide a synthesis of the findings regarding how different types of technology influence behavioral, emotional, and cognitive student engagement and associated indicators.

Influence of technology on student engagement

We identified five technologies post-literature search (i.e., web-conferencing, blogs, wikis, social networking sites, and digital games) to include in our review, based on the frequency with which they appeared in the literature over the past 5 years. One commonality among these technologies is their potential value in supporting a constructivist approach to learning, characterized by the active discovery of knowledge through reflection on experiences with one’s environment, the connection of new knowledge to prior knowledge, and interaction with others (Boghossian, 2006; Clements, 2015). Another commonality is that most of the technologies, except perhaps for digital games, are designed primarily to promote interaction and collaboration with others. Our search yielded very few studies on how informational technologies, such as video lectures and podcasts, influence student engagement. Therefore, these technologies are notably absent from our review. Unlike the technologies we identified earlier, informational technologies reflect a behaviorist approach to learning in which students are passive recipients of knowledge that is transmitted from an expert (Boghossian, 2006). The lack of recent research on how informational technologies affect student engagement may be due to the increasing shift from instructor-centered, behaviorist approaches to student-centered, constructivist approaches within higher education (Haggis, 2009; Wright, 2011), along with the ubiquity of Web 2.0 technologies.

Web-conferencing

Web-conferencing software provides a virtual meeting space where users log in simultaneously and communicate about a given topic. While each software application is unique, many share similar features such as audio, video, or instant messaging options for real-time communication; screen sharing, whiteboards, and digital pens for presentations and demonstrations; polls and quizzes for gauging comprehension or eliciting feedback; and breakout rooms for small group work (Bower, 2011; Hudson, Knight, & Collins, 2012; Martin, Parker, & Deale, 2012; McBrien, Jones, & Cheng, 2009). Of the technologies included in this literature review, web-conferencing software most closely mimics the face-to-face classroom environment, providing a space where instructors and students can hear and see each other in real time as typical classroom activities (i.e., delivering lectures, discussing course content, asking/answering questions) are carried out (Francescucci & Foster, 2013; Hudson et al., 2012). Studies on web-conferencing software deployed Adobe Connect, Cisco WebEx, Horizon Wimba, or Blackboard Collaborate and made use of multiple features, such as screen sharing, instant messaging, polling, and breakout rooms. In addition, most of the studies integrated web-conferencing software into courses on a voluntary basis to supplement traditional instructional methods (Andrew, Maslin-Prothero, & Ewens, 2015; Armstrong & Thornton, 2012; Francescucci & Foster, 2013; Hudson et al., 2012; Martin et al., 2012; Wdowik, 2014). Existing studies on web-conferencing pertain to all three types of student engagement.

Studies on web-conferencing and behavioral engagement reveal mixed findings. For example, voluntary attendance in web-conferencing sessions ranged from 54 to 57% (Andrew et al., 2015; Armstrong & Thornton, 2012), and, in a comparison between a blended course with regular web-conferencing sessions and a traditional, face-to-face course, researchers found no significant difference in student attendance. However, students in the blended course reported higher levels of class participation compared to students in the face-to-face course (Francescucci & Foster, 2013). These findings suggest that while web-conferencing may not boost attendance, especially if voluntary, it may offer more opportunities for class participation, perhaps through communication channels typically not available in a traditional, face-to-face course (e.g., instant messaging, anonymous polling). Studies on web-conferencing and interaction, another behavioral indicator, support this assertion. For example, researchers found that students use various features of web-conferencing software (e.g., polling, instant messaging, breakout rooms) to interact with peers and the instructor by asking questions, expressing opinions and ideas, sharing resources, and discussing academic content (Andrew et al., 2015; Armstrong & Thornton, 2012; Hudson et al., 2012; Martin et al., 2012; Wdowik, 2014).

Studies on web-conferencing and cognitive engagement are more conclusive than those for behavioral engagement, although fewer in number. Findings suggest that students who participated in web-conferencing demonstrated critical reflection and enhanced learning through interactions with others (Armstrong & Thornton, 2012), higher-order thinking (e.g., problem-solving, synthesis, evaluation) in response to challenging assignments (Wdowik, 2014), and motivation to learn, particularly when using polling features (Hudson et al., 2012). Only one study examines how web-conferencing affects emotional engagement, and its findings are positive, suggesting that students who participated in web-conferences had higher levels of interest in course content than those who did not (Francescucci & Foster, 2013). One possible reason for the positive cognitive and emotional engagement findings may be that web-conferencing software provides many features that promote active learning. For example, whiteboards and breakout rooms provide opportunities for real-time, collaborative problem-solving activities and discussions. However, additional studies are needed to isolate and compare specific web-conferencing features to determine which have the greatest effect on student engagement.

Blogs

A blog, short for Weblog, is a collection of personal journal entries, published online and presented chronologically, to which readers (or subscribers) may respond by providing additional commentary or feedback. To create a blog, one must compose content for an entry, which may include text, hyperlinks, graphics, audio, or video; publish the content online using a blogging application; and alert subscribers that new content is posted. Blogs may be informal and personal in nature or may serve as formal commentary in a specific genre, such as politics or education (Coghlan et al., 2007). Fortunately, many blog applications are free, and many learning management systems (LMSs) offer a blogging feature that is seamlessly integrated into the online classroom. The ease of blogging has attracted attention from educators, who currently use blogs as an instructional tool for the expression of ideas, opinions, and experiences and for promoting dialogue on a wide range of academic topics (Garrity, Jones, VanderZwan, de la Rocha, & Epstein, 2014; Wang, 2008).

Studies on blogs show consistently positive findings for many of the behavioral and emotional engagement indicators. For example, students reported that blogs promoted interaction with others, through greater communication and information sharing with peers (Chu, Chan, & Tiwari, 2012; Ivala & Gachago, 2012; Mansouri & Piki, 2016), and analyses of blog posts show evidence of students elaborating on one another’s ideas and sharing experiences and conceptions of course content (Sharma & Tietjen, 2016). Blogs also contribute to emotional engagement by providing students with opportunities to express their feelings about learning and by encouraging positive attitudes about learning (Dos & Demir, 2013; Chu et al., 2012; Yang & Chang, 2012). For example, Dos and Demir (2013) found that students expressed prejudices and fears about specific course topics in their blog posts. In addition, Yang and Chang (2012) found that interactive blogging, where comment features were enabled, led to more positive attitudes about course content and peers compared to solitary blogging, where comment features were disabled.

The literature on blogs and cognitive engagement is less consistent. Some studies suggest that blogs may help students engage in active learning, problem-solving, and reflection (Chawinga, 2017; Chu et al., 2012; Ivala & Gachago, 2012; Mansouri & Piki, 2016), while other studies suggest that students’ blog posts show very little evidence of higher-order thinking (Dos & Demir, 2013; Sharma & Tietjen, 2016). The inconsistency in findings may be due to the wording of blog instructions. Students may not necessarily demonstrate or engage in deep processing of information unless explicitly instructed to do so. Unfortunately, it is difficult to determine whether the wording of blog assignments contributed to the mixed results because many of the studies did not provide assignment details. However, studies pertaining to other technologies suggest that assignment wording that lacks specificity or requires low-level thinking can have detrimental effects on student engagement outcomes (Hou, Wang, Lin, & Chang, 2015; Prestridge, 2014). Therefore, blog assignments that are vague or require only low-level thinking may have adverse effects on cognitive engagement.

Wikis

A wiki is a web page that can be edited by multiple users at once (Nakamaru, 2012). Wikis have gained popularity in educational settings as a viable tool for group projects where group members can work collaboratively to develop content (i.e., writings, hyperlinks, images, graphics, media) and keep track of revisions through an extensive versioning system (Roussinos & Jimoyiannis, 2013). Most studies on wikis pertain to behavioral engagement, with far fewer studies on cognitive engagement and none on emotional engagement. Studies pertaining to behavioral engagement reveal mixed results, with some showing very little enduring participation in wikis beyond the first few weeks of the course (Nakamaru, 2012; Salaber, 2014) and another showing active participation, as seen in high numbers of posts and edits (Roussinos & Jimoyiannis, 2013). The most notable difference between these studies is the presence of grading, which may account for the inconsistencies in findings. For example, in studies where participation was low, wikis were ungraded, suggesting that students may need extra motivation and encouragement to use wikis (Nakamaru, 2012; Salaber, 2014). Findings regarding the use of wikis for promoting interaction are also inconsistent. In some studies, students reported that wikis were useful for interaction, teamwork, collaboration, and group networking (Camacho, Carrión, Chayah, & Campos, 2016; Martínez, Medina, Albalat, & Rubió, 2013; Morely, 2012; Calabretto & Rao, 2011), and researchers found evidence of substantial collaboration among students (e.g., sharing ideas, opinions, and points of view) in wiki activity (Hewege & Perera, 2013); however, Miller, Norris, and Bookstaver (2012) found that only 58% of students reported that wikis promoted collegiality among peers. The findings in the latter study were unexpected and may be due to design flaws in the wiki assignments. For example, the authors noted that wiki assignments were not explicitly referred to in face-to-face classes; this disconnect may have prevented students from building on interactive momentum achieved during out-of-class wiki assignments (Miller et al., 2012).

Studies regarding cognitive engagement are limited in number but more consistent than those concerning behavioral engagement, suggesting that wikis promote high levels of knowledge construction (i.e., evaluation of arguments, the integration of multiple viewpoints, new understanding of course topics; Hewege & Perera, 2013) and are useful for reflection, reinforcing course content, and applying academic skills (Miller et al., 2012). Overall, there is mixed support for the use of wikis to promote behavioral engagement, although making wiki assignments mandatory and explicitly referring to wikis in class may help bolster participation and interaction. In addition, there is some support for using wikis to promote cognitive engagement, but additional studies are needed to confirm and expand on findings as well as to explore the effect of wikis on emotional engagement.

Social networking sites

Social networking is “the practice of expanding knowledge by making connections with individuals of similar interests” (Gunawardena et al., 2009, p. 4). Social networking sites, such as Facebook, Twitter, Instagram, and LinkedIn, allow users to create and share digital content publicly or with others to whom they are connected, and to communicate privately through messaging features. Two of the most popular social networking sites in the educational literature are Facebook and Twitter (Camus, Hurt, Larson, & Prevost, 2016; Manca & Ranieri, 2013), which is consistent with recent statistics suggesting that both sites are also exceedingly popular among the general population (Greenwood, Perrin, & Duggan, 2016). In the sections that follow, we examine how both Facebook and Twitter influence different types of student engagement.

Facebook

Facebook is a web-based service that allows users to create a public or private profile and invite others to connect. Users may build social, academic, and professional connections by posting messages in various media formats (i.e., text, pictures, videos) and commenting on, liking, and reacting to others’ messages (Bowman & Akcaoglu, 2014; Maben, Edwards, & Malone, 2014; Hou et al., 2015). Within an educational context, Facebook has often been used as an instructional tool to supplement lectures or LMSs, supporting class discussions or the development, delivery, and sharing of academic content and resources. Many instructors have opted to create private Facebook groups, which offer an added layer of security and privacy because they are not accessible to strangers (Bahati, 2015; Bowman & Akcaoglu, 2014; Clements, 2015; Dougherty & Andercheck, 2014; Esteves, 2012; Shraim, 2014; Maben et al., 2014; Manca & Ranieri, 2013; Naghdipour & Eldridge, 2016; Rambe, 2012). The majority of studies on Facebook address behavioral indicators of student engagement, with far fewer focusing on emotional or cognitive engagement.

Studies that examine the influence of Facebook on behavioral engagement focus both on participation in learning activities and on interaction with peers and instructors. In most studies, Facebook activities were voluntary, and participation rates ranged from 16 to 95%, with an average rate of 47% (Bahati, 2015; Bowman & Akcaoglu, 2014; Dougherty & Andercheck, 2014; Fagioli, Rios-Aguilar, & Deil-Amen, 2015; Rambe, 2012; Staines & Lauchs, 2013). Participation was assessed by tracking how many students joined course- or university-specific Facebook groups (Bahati, 2015; Bowman & Akcaoglu, 2014; Fagioli et al., 2015), visited or followed course-specific Facebook pages (DiVall & Kirwin, 2012; Staines & Lauchs, 2013), or posted at least once to a course-specific Facebook page (Rambe, 2012). The lowest level of participation (16%) arose from a study in which community college students were invited to use the Schools App, a free application that connects students to their university’s private Facebook community. While the authors acknowledged that building an online community of college students is difficult (Fagioli et al., 2015), having to download the Schools App may have deterred widespread participation. In addition, use of the app was not tied to any specific courses or assignments; therefore, students may have lacked adequate incentive to use it. The highest level of participation (95%) in the literature arose from a study in which the instructor created a Facebook page where students could find or post study tips or ask questions. Followership of the page was highest around exams, when students likely had stronger motivation to access study tips and ask the instructor questions (DiVall & Kirwin, 2012). The wide range of participation in Facebook activities suggests that some students may be intrinsically motivated to participate, while other students may need external encouragement. For example, Bahati (2015) found that when students assumed that a course-specific Facebook group was voluntary, only 23% participated, but when the instructor confirmed that the group was, in fact, mandatory, participation rose to 94%.

While voluntary participation in Facebook activities may be lower than desired or expected (Dyson, Vickers, Turtle, Cowan, & Tassone, 2015; Fagioli et al., 2015; Naghdipour & Eldridge, 2016; Rambe, 2012), students seem to have a clear preference for Facebook compared to other instructional tools (Clements, 2015; DiVall & Kirwin, 2012; Hurt et al., 2012; Hou et al., 2015; Kent, 2013). For example, in one study where an instructor shared course-related information in a Facebook group, in the LMS, and through email, the level of participation in the Facebook group was ten times higher than in email or the LMS (Clements, 2015). In other studies, class discussions held in Facebook resulted in greater levels of participation and dialogue than class discussions held in LMS discussion forums (Camus et al., 2016; Hurt et al., 2012; Kent, 2013). Researchers found that the preference for Facebook over the university’s LMS is due to perceptions that the LMS is outdated and unorganized and reports that Facebook is more familiar, convenient, and accessible, given that many students already visit the social networking site multiple times per day (Clements, 2015; Dougherty & Andercheck, 2014; Hurt et al., 2012; Kent, 2013). In addition, students report that Facebook helps them stay engaged in learning through collaboration and interaction with both peers and instructors (Bahati, 2015; Shraim, 2014), which is evident in Facebook posts where students collaborated to study for exams, consulted on technical and theoretical problem solving, discussed course content, exchanged learning resources, and expressed opinions as well as academic successes and challenges (Bowman & Akcaoglu, 2014; Dougherty & Andercheck, 2014; Esteves, 2012; Ivala & Gachago, 2012; Maben et al., 2014; Rambe, 2012; van Beynen & Swenson, 2016).

There is far less evidence in the literature about the use of Facebook for emotional and cognitive engagement. In terms of emotional engagement, studies suggest that students feel positively about being part of a course-specific Facebook group and that Facebook is useful for expressing feelings about learning and concerns for peers, through features such as the “like” button and emoticons (Bowman & Akcaoglu, 2014; Dougherty & Andercheck, 2014; Naghdipour & Eldridge, 2016). In addition, being involved in a course-specific Facebook group was positively related to students’ sense of belonging in the course (Dougherty & Andercheck, 2014). The research on cognitive engagement is less conclusive, with some studies suggesting that Facebook participation is related to academic persistence (Fagioli et al., 2015) and self-regulation (Dougherty & Andercheck, 2014), while other studies show low levels of knowledge construction in Facebook posts (Hou et al., 2015), particularly when compared to discussions held in the LMS. One possible reason may be that the LMS is associated with formal, academic interactions while Facebook is associated with informal, social interactions (Camus et al., 2016). While additional research is needed to confirm the efficacy of Facebook for promoting cognitive engagement, studies suggest that Facebook may be a viable tool for increasing specific behavioral and emotional engagement indicators, such as interactions with others and a sense of belonging within a learning community.

Twitter

Twitter is a web-based service where subscribers can post short, real-time messages, called tweets, of no more than 140 characters. Tweets may contain hyperlinks to other websites, images, graphics, and/or videos and may be tagged by topic using the hashtag symbol before the designated label (e.g., #elearning). Twitter subscribers may “follow” other users and gain access to their tweets and may also “retweet” messages that have already been posted (Hennessy, Kirkpatrick, Smith, & Border, 2016; Osgerby & Rush, 2015; Prestridge, 2014; West, Moore, & Barry, 2015; Tiernan, 2014). Instructors may use Twitter to post updates about the course, clarify expectations, direct students to additional learning materials, and encourage students to discuss course content (Bista, 2015; Williams & Whiting, 2016). Several of the studies on the use of Twitter included broad, all-encompassing measures of student engagement and produced mixed findings. For example, some studies suggest that Twitter increases student engagement (Evans, 2014; Gagnon, 2015; Junco, Heiberger, & Loken, 2011), while other studies suggest that Twitter has little to no influence on student engagement (Junco, Elavsky, & Heiberger, 2013; McKay, Sanko, Shekhter, & Birnbach, 2014). In both studies suggesting little to no influence, Twitter use was voluntary, and in one of the studies faculty involvement in Twitter was low, which may account for the negative findings (Junco et al., 2013; McKay et al., 2014). Conversely, in the studies that show positive findings, Twitter use was mandatory and often directly integrated with required assignments (Evans, 2014; Gagnon, 2015; Junco et al., 2011). Therefore, making Twitter use mandatory, increasing faculty involvement, and integrating Twitter into assignments may help to increase student engagement.

Studies pertaining to specific behavioral student engagement indicators also reveal mixed findings. For example, in studies where course-related Twitter use was voluntary, 45-91% of students reported using Twitter during the term (Hennessy et al., 2016; Junco et al., 2013; Ross, Banow, & Yu, 2015; Tiernan, 2014; Williams & Whiting, 2016), but only 30-36% reported contributing to the course-specific Twitter page (Hennessy et al., 2016; Tiernan, 2014; Ross et al., 2015; Williams & Whiting, 2016). The study that reported a 91% participation rate was unique because the course-specific Twitter page was accessible via a public link; therefore, students who chose only to view the content (58%), rather than contribute to the page, did not have to create a Twitter account (Hennessy et al., 2016). The convenience of not having to create an account may be one reason for the much higher participation rate. In terms of low participation rates, a lack of literacy with, familiarity with, and interest in Twitter, as well as a preference for Facebook, are cited as contributing factors (Bista, 2015; McKay et al., 2014; Mysko & Delgaty, 2015; Osgerby & Rush, 2015; Tiernan, 2014). However, when the use of Twitter was required and integrated into class discussions, the participation rate was 100% (Gagnon, 2015). Similarly, 46% of students in one study indicated that they would have been more motivated to participate in Twitter activities if they were graded (Osgerby & Rush, 2015), again confirming the power of extrinsic motivating factors.

Studies also show mixed results for the use of Twitter to promote interactions with peers and instructors. Researchers found that when instructors used Twitter to post updates about the course, ask and answer questions, and encourage students to tweet about course content, there was evidence of student-student and student-instructor interactions in tweets (Hennessy et al., 2016; Tiernan, 2014). Some students echoed these findings, suggesting that Twitter is useful for sharing ideas and resources, discussing course content, asking the instructor questions, and networking (Chawinga, 2017; Evans, 2014; Gagnon, 2015; Hennessy et al., 2016; Mysko & Delgaty, 2015; West et al., 2015) and is preferable to speaking aloud in class because it is more comfortable, less threatening, and more concise due to the 140-character limit (Gagnon, 2015; Mysko & Delgaty, 2015; Tiernan, 2014). Conversely, other students reported that Twitter was not useful for improving interaction because they viewed it predominantly as a tool for social, rather than academic, interactions and found the 140-character limit frustrating and restrictive. A theme among the latter studies was that a large proportion of the sample had never used Twitter before (Bista, 2015; McKay et al., 2014; Osgerby & Rush, 2015), which may have contributed to negative perceptions.

The literature on the use of Twitter for cognitive and emotional engagement is minimal but nonetheless promising in terms of promoting knowledge gains, the practical application of content, and a sense of belonging among users. For example, using Twitter to respond to questions that arose in lectures and to tweet about course content throughout the term is associated with increased understanding of course content and application of knowledge (Kim et al., 2015; Tiernan, 2014; West et al., 2015). While the underlying mechanisms pertaining to why Twitter promotes an understanding of content and application of knowledge are not entirely clear, Tiernan (2014) suggests that one possible reason may be that Twitter helps to break down communication barriers, encouraging shy or timid students to participate in discussions that ultimately are richer in dialogue and debate. In terms of emotional engagement, students who participated in a large, class-specific Twitter page were more likely to feel a sense of community and belonging compared to those who did not participate, because they could more easily find support from and share resources with other Twitter users (Ross et al., 2015). Despite the positive findings about the use of Twitter for cognitive and emotional engagement, more studies are needed to confirm existing results regarding behavioral engagement and to target additional engagement indicators, such as motivation, persistence, and attitudes, interests, and values about learning. In addition, given the strong negative perceptions of Twitter that still exist, additional studies are needed to confirm Twitter’s efficacy for promoting different types of behavioral engagement among both novice and experienced Twitter users, particularly when compared to more familiar tools such as Facebook or LMS discussion forums.

Digital games

Digital games are “applications using the characteristics of video and computer games to create engaging and immersive learning experiences for delivery of specified learning goals, outcomes and experiences” (de Freitas, 2006, p. 9). Digital games often serve the dual purpose of promoting the achievement of learning outcomes while making learning fun, by providing simulations of real-world scenarios as well as role play, problem-solving, and drill-and-repeat activities (Boyle et al., 2016; Connolly, Boyle, MacArthur, Hainey, & Boyle, 2012; Scarlet & Ampolos, 2013; Whitton, 2011). In addition, gamified elements, such as digital badges and leaderboards, may be integrated into instruction to provide additional motivation for completing assigned readings and other learning activities (Armier, Shepherd, & Skrabut, 2016; Hew, Huang, Chu, & Chiu, 2016). The pedagogical benefits of digital games are somewhat distinct from those of the other technologies addressed in this review, which are designed primarily for social interaction. While digital games may be played in teams or allow one player to compete against another, their design often focuses on providing opportunities for students to interact with academic content in a virtual environment through decision-making, problem-solving, and reward mechanisms. For example, a digital game may require students to adopt the role of CEO in a computer-simulated business environment, make decisions about a series of organizational issues, and respond to the consequences of those decisions. In this example and others, digital games use adaptive learning principles, where the learning environment is re-configured or modified in response to the actions and needs of students (Bower, 2016). Most of the studies on digital games focused on cognitive and emotional indicators of student engagement, in contrast to the previous technologies addressed in this review, which primarily focused on behavioral indicators of engagement.

Existing studies provide support for the influence of digital games on cognitive engagement, through achieving a greater understanding of course content and demonstrating higher-order thinking skills (Beckem & Watkins, 2012; Farley, 2013; Ke, Xie, & Xie, 2016; Marriott, Tan, & Marriott, 2015), particularly when compared to traditional instructional methods, such as giving lectures or assigning textbook readings (Lu, Hallinger, & Showanasai, 2014; Siddique, Ling, Roberson, Xu, & Geng, 2013; Zimmermann, 2013). For example, in a study comparing courses that offered computer simulations of business challenges (e.g., implementing a new information technology system, managing a startup company, and managing a brand of medicine in a simulated market environment) and courses that did not, students in simulation-based courses reported higher levels of action-directed learning (i.e., connecting theory to practice in a business context) than students in traditional, non-simulation-based courses (Lu et al., 2014). Similarly, engineering students who participated in a car simulator game, which was designed to help students apply and reinforce the knowledge gained from lectures, demonstrated higher levels of critical thinking (i.e., analysis, evaluation) on a quiz than students who only attended lectures (Siddique et al., 2013).

Motivation is another cognitive engagement indicator that is linked to digital games (Armier et al., 2016; Chang & Wei, 2016; Dichev & Dicheva, 2017; Grimley, Green, Nilsen, & Thompson, 2012; Hew et al., 2016; Ibáñez, Di-Serio, & Delgado-Kloos, 2014; Ke et al., 2016; Liu, Cheng, & Huang, 2011; Nadolny & Halabi, 2016). Researchers found that incorporating gamified elements into courses, such as giving students digital rewards (e.g., redeemable points, trophies, and badges) for participating in learning activities or creating competition through the use of leaderboards where students can see how they rank against other students, positively affects student motivation to complete learning tasks (Armier et al., 2016; Chang & Wei, 2016; Hew et al., 2016; Nadolny & Halabi, 2016). In addition, students who participated in gamified elements, such as trying to earn digital badges, were more motivated to complete particularly difficult learning activities (Hew et al., 2016) and showed persistence in exceeding learning requirements (Ibáñez et al., 2014). Research on emotional engagement may help to explain these findings. Studies suggest that digital games positively affect student attitudes about learning, evident in student reports that games are fun, interesting, and enjoyable (Beckem & Watkins, 2012; Farley, 2013; Grimley et al., 2012; Hew et al., 2016; Liu et al., 2011; Zimmermann, 2013), which may account for higher levels of student motivation in courses that offered digital games.

Research on digital games and behavioral engagement is more limited, with only one study suggesting that games lead to greater participation in educational activities (Hew et al., 2016); therefore, more research is needed to explore how digital games may influence behavioral engagement. In addition, research is needed to determine whether the underlying technology associated with digital games (e.g., computer-based simulations and virtual realities) produces positive engagement outcomes or whether common mechanisms associated with both digital and non-digital games (e.g., role play, rewards, and competition) account for those outcomes. For example, studies in which non-digital, face-to-face games were used also showed positive effects on student engagement (Antunes, Pacheco, & Giovanela, 2012; Auman, 2011; Coffey, Miller, & Feuerstein, 2011; Crocco, Offenholley, & Hernandez, 2016; Poole, Kemp, Williams, & Patterson, 2014; Scarlet & Ampolos, 2013); therefore, it is unclear if and how digitizing games contributes to student engagement.

Discussion and implications

Student engagement is linked to a number of academic outcomes, such as retention, grade point average, and graduation rates (Carini et al., 2006; Center for Postsecondary Research, 2016; Hu & McCormick, 2012). As a result, universities have shown a strong interest in how to increase student engagement, particularly given rising external pressures to improve learning outcomes and prepare students for academic success (Axelson & Flick, 2011; Kuh, 2009). There are various models of student engagement that identify factors that influence it (Kahu, 2013; Lam et al., 2012; Nora et al., 2005; Wimpenny & Savin-Baden, 2013; Zepke & Leach, 2010); however, none include the overt role of technology, despite the growing trend and student demands to integrate technology into the learning experience (Amirault, 2012; Cook & Sonnenberg, 2014; Revere & Kovach, 2011; Sun & Chen, 2016; Westera, 2015). Therefore, the primary purpose of our literature review was to explore whether technology influences student engagement. The secondary purpose was to address skepticism and uncertainty about the pedagogical benefits of technology (Ashrafzadeh & Sayadian, 2015; Kopcha et al., 2016; Reid, 2014) by reviewing the literature regarding the efficacy of specific technologies (i.e., web-conferencing software, blogs, wikis, social networking sites, and digital games) for promoting student engagement and by offering recommendations for effective implementation, which are included at the end of this paper. In the sections that follow, we provide an overview of the findings, an explanation of existing methodological limitations and areas for future research, and a list of best practices for integrating the technologies we reviewed into the teaching and learning process.

Summary of findings

Findings from our literature review provide preliminary support for including technology as a factor that influences student engagement in existing models (Table 1 ). One overarching theme is that most of the technologies we reviewed had a positive influence on multiple indicators of student engagement, which may lead to a larger return on investment in terms of learning outcomes. For example, digital games influence all three types of student engagement and six of the seven indicators we identified, surpassing the other technologies in this review. There were several key differences in the design and pedagogical use between digital games and other technologies that may explain these findings. First, digital games were designed to provide authentic learning contexts in which students could practice skills and apply learning (Beckem & Watkins, 2012 ; Farley, 2013 ; Grimley et al., 2012 ; Ke et al., 2016 ; Liu et al., 2011 ; Lu et al., 2014 ; Marriott et al., 2015 ; Siddique et al., 2013 ), which is consistent with experiential learning and adult learning theories. Experiential learning theory suggests that learning occurs through interaction with one’s environment (Kolb, 2014 ) while adult learning theory suggests that adult learners want to be actively involved in the learning process and be able apply learning to real life situations and problems (Cercone, 2008 ). Second, students reported that digital games (and gamified elements) are fun, enjoyable, and interesting (Beckem & Watkins, 2012 ; Farley, 2013 ; Grimley et al., 2012 ; Hew et al., 2016 ; Liu et al., 2011 ; Zimmermann, 2013 ), feelings that are associated with a flow-like state where one is completely immersed in and engaged with the activity (Csikszentmihalyi, 1988 ; Weibel, Wissmath, Habegger, Steiner, & Groner, 2008 ). 
Third, digital games were closely integrated into the curriculum as required activities (Farley, 2013; Grimley et al., 2012; Ke et al., 2016; Liu et al., 2011; Marriott et al., 2015; Siddique et al., 2013), as opposed to wikis, Facebook, and Twitter, which were often voluntary and used to supplement lectures (Dougherty & Andercheck, 2014; Nakamaru, 2012; Prestridge, 2014; Rambe, 2012).

Web-conferencing software and Facebook also yielded the most positive findings, influencing four of the seven indicators of student engagement, compared to other collaborative technologies, such as blogs, wikis, and Twitter. Web-conferencing software was unique due to the sheer number of collaborative features it offers, providing multiple ways for students to actively engage with course content (screen sharing, whiteboards, digital pens) and interact with peers and the instructor (audio, video, text chats, breakout rooms) (Bower, 2011; Hudson et al., 2012; Martin et al., 2012; McBrien et al., 2009); this may account for the effects on multiple indicators of student engagement. Positive findings regarding Facebook’s influence on student engagement could be explained by students’ strong familiarity with and preference for the social networking site (Clements, 2015; DiVall & Kirwin, 2012; Hurt et al., 2012; Hou et al., 2015; Kent, 2013; Manca & Ranieri, 2013), compared to Twitter, which was less familiar or interesting to students (Bista, 2015; McKay et al., 2014; Mysko & Delgaty, 2015; Osgerby & Rush, 2015; Tiernan, 2014). Wikis had the lowest influence on student engagement, with mixed findings regarding behavioral engagement, limited but conclusive findings regarding one indicator of cognitive engagement (deep processing of information), and no studies pertaining to the other indicators of cognitive engagement (motivation, persistence) or to emotional engagement.

Another theme that arose was the prevalence of mixed findings across multiple technologies regarding behavioral engagement. Overall, the vast majority of studies addressed behavioral engagement, and we expected that technologies designed specifically for social interaction, such as web-conferencing, wikis, and social networking sites, would yield more conclusive findings. However, one possible reason for the mixed findings may be that the technologies were voluntary in many studies, resulting in lower than desired participation rates and missed opportunities for interaction (Armstrong & Thornton, 2012; Fagioli et al., 2015; Nakamaru, 2012; Rambe, 2012; Ross et al., 2015; Williams & Whiting, 2016), and mandatory in only a few studies, which yielded higher levels of participation and interaction (Bahati, 2015; Gagnon, 2015; Roussinos & Jimoyiannis, 2013). Another possible reason for the mixed findings is that measures of variables differed across studies. For example, in some studies participation meant that a student signed up for a Twitter account (Tiernan, 2014), used the Twitter account for class (Williams & Whiting, 2016), or viewed the course-specific Twitter page (Hennessy et al., 2016). The pedagogical uses of the technologies also varied considerably across studies, making it difficult to draw comparisons. For example, Facebook was used in studies to share learning materials (Clements, 2015; Dyson et al., 2015), answer student questions about academic content or administrative issues (Rambe, 2012), prepare for upcoming exams and share study tips (Bowman & Akcaoglu, 2014; DiVall & Kirwin, 2012), complete group work (Hou et al., 2015; Staines & Lauchs, 2013), and discuss course content (Camus et al., 2016; Kent, 2013; Hurt et al., 2012). Finally, cognitive indicators (motivation and persistence) drew the fewest studies, which suggests that research is needed to determine whether technologies affect these indicators.

Methodological limitations

While there appears to be preliminary support for the use of many of these technologies to promote student engagement, there are significant methodological limitations in the literature and, as a result, findings should be interpreted with caution. First, many studies used small sample sizes and were limited to one course, one degree level, and one university; therefore, generalizability is limited. Second, very few studies used experimental or quasi-experimental designs, so very little evidence exists to substantiate a cause-and-effect relationship between technologies and student engagement indicators. In addition, in many studies that did use experimental or quasi-experimental designs, participants were not randomized; rather, participants who volunteered to use a specific technology were compared to those who chose not to use it. As a result, there is a possibility that fundamental differences between users and non-users could have affected the engagement results. Furthermore, many of the studies did not isolate specific technological features (e.g., using only the breakout rooms for group work in web-conferencing software, rather than using the chat feature, screen sharing, and breakout rooms together). Using multiple features at once could have conflated student engagement results. Third, many studies relied on a single source to measure both technological and engagement variables (single-source bias), such as self-report data (i.e., reported usage of technology and perceptions of student engagement), which may have affected the validity of the results. Fourth, many studies were conducted during a very brief timeframe, such as one academic term. As a result, positive student engagement findings may be attributed to a “novelty effect” (Dichev & Dicheva, 2017) associated with using a new technology.
Finally, many studies lack adequate details about learning activities, raising questions about whether poor instructional design may have adversely affected results. For example, an instructor may intend to elicit higher-order thinking from students, but if learning activity instructions are written using low-level verbs, such as identify, describe, and summarize, students will be less likely to engage in higher-order thinking.

Areas for future research

The findings of our literature review suggest that the influence of technology on student engagement is still a developing area of knowledge that requires additional research to build on promising but limited evidence, clarify mixed findings, and address several gaps in the literature. As such, our recommendations for future research are as follows:

Examine the effect of collaborative technologies (i.e., web-conferencing, blogs, wikis, and social networking sites) on emotional and cognitive student engagement. There are significant gaps in the literature regarding whether these technologies affect attitudes, interests, and values about learning; a sense of belonging within a learning community; motivation to learn; and persistence to overcome academic challenges and meet or exceed requirements.

Clarify mixed findings, particularly regarding how web-conferencing software, wikis, Facebook, and Twitter affect participation in learning activities. Researchers should make considerable efforts to gain consensus on, or at least increase the consistency of, how participation is measured (e.g., visited the Facebook group or contributed one post a week) in order to make meaningful comparisons and draw conclusions about the efficacy of various technologies for promoting behavioral engagement. In addition, further research is needed to clarify findings regarding how wikis and Twitter influence interaction and how blogs and Facebook influence deep processing of information. Future research studies should include justifications for the pedagogical use of specific technologies and detailed instructions for learning activities to minimize adverse findings from poor instructional design and to encourage replication.

Conduct longitudinal studies over several academic terms and across multiple academic disciplines, degree levels, and institutions to determine the long-term effects of specific technologies on student engagement and to increase the generalizability of findings. Future studies should also take individual factors into account, such as gender, age, and prior experience with the technology, since studies suggest that a lack of prior experience or familiarity with Twitter was a barrier to its use in educational settings (Bista, 2015; Mysko & Delgaty, 2015; Tiernan, 2014).

Compare student engagement outcomes between and among different technologies and non-technological approaches. For example, studies suggest that students prefer Facebook over Twitter (Bista, 2015; Osgerby & Rush, 2015), but no studies have compared these technologies for promoting student engagement. Also, studies are needed that isolate and compare different features within the same technology to determine which might be most effective for increasing engagement. Finally, studies on digital games (Beckem & Watkins, 2012; Grimley et al., 2012; Ke et al., 2016; Lu et al., 2014; Marriott et al., 2015; Siddique et al., 2013) and face-to-face games (Antunes et al., 2012; Auman, 2011; Coffey et al., 2011; Crocco et al., 2016; Poole et al., 2014; Scarlet & Ampolos, 2013) show similar, positive effects on student engagement; therefore, additional research is needed to determine the degree to which the delivery method (i.e., digital versus face-to-face) accounts for positive gains in student engagement.

Determine whether other technologies not included in this review influence student engagement. Facebook and Twitter regularly appear in the literature on social networking, but it is unclear how other popular social networking sites, such as LinkedIn, Instagram, and Flickr, influence student engagement. Future research should focus on the efficacy of these and other popular social networking sites for promoting student engagement. In addition, there were very few studies about whether informational technologies, which involve the one-way transmission of information to students, affect different types of student engagement. Future research should examine whether informational technologies, such as video lectures, podcasts, and pre-recorded narrated PowerPoint presentations or screencasts, affect student engagement. Finally, studies should examine the influence of mobile software and technologies, such as educational apps and smartphones, on student engagement.

Achieve greater consensus on the meaning of student engagement and its distinction from similar concepts in the literature, such as social and cognitive presence (Garrison & Arbaugh, 2007).

Recommendations for practice

Despite the existing gaps and mixed findings in the literature, we were able to compile a list of recommendations for when and how to use technology to increase the likelihood of promoting student engagement. What follows is not an exhaustive list; rather, it is a synthesis of both research findings and lessons learned from the studies we reviewed. There may be other recommendations to add to this list; however, our intent is to provide some useful information to help address barriers to technology integration among faculty who feel uncertain or unprepared to use technology (Ashrafzadeh & Sayadian, 2015; Hauptman, 2015; Kidd et al., 2016; Reid, 2014) and to add to the body of practical knowledge in instructional design and delivery. Our recommendations for practice are as follows:

Consider context before selecting technologies. Contextual factors such as existing technological infrastructure and requirements, program and course characteristics, and the intended audience will help determine which technologies, if any, are most appropriate (Bullen & Morgan, 2011; Bullen, Morgan, & Qayyum, 2011). For example, requiring students to use a blog that is not well integrated with the existing LMS may prove too frustrating for both the instructor and students. Similarly, integrating Facebook- and Twitter-based learning activities throughout a marketing program may be more appropriate, given the subject matter, than doing so in an engineering or accounting program where social media is less integral to the profession. Finally, do not assume that students appreciate or are familiar with all technologies. For example, students who did not already have Facebook or Twitter accounts were less likely to use either for learning purposes and perceived setting up an account as an increase in workload (Bista, 2015; Clements, 2015; DiVall & Kirwin, 2012; Hennessy et al., 2016; Mysko & Delgaty, 2015; Tiernan, 2014). Therefore, prior to using any technology, instructors may want to determine how many students already have accounts and/or are familiar with the technology.

Carefully select technologies based on their strengths and limitations and the intended learning outcome. For example, Twitter is limited to 140 characters, making it a viable tool for learning activities that require brevity. In one study, an instructor used Twitter for short pop quizzes during lectures, where the first few students to tweet the correct answer received additional points (Kim et al., 2015), which helped students practice applying knowledge. In addition, studies show that students perceive Twitter and Facebook to be primarily for social interactions (Camus et al., 2016; Ross et al., 2015), which may make these technologies viable tools for sharing resources, giving brief opinions about news stories pertaining to course content, or having casual conversations with classmates rather than full-fledged scholarly discourse.

Incentivize students to use technology, either by assigning regular grades or by giving extra credit. The average participation rate in the voluntary web-conferencing, Facebook, and Twitter learning activities in the studies we reviewed was 52% (Andrew et al., 2015; Armstrong & Thornton, 2012; Bahati, 2015; Bowman & Akcaoglu, 2014; DiVall & Kirwin, 2012; Dougherty & Andercheck, 2014; Fagioli et al., 2015; Hennessy et al., 2016; Junco et al., 2013; Rambe, 2012; Ross et al., 2015; Staines & Lauchs, 2013; Tiernan, 2014; Williams & Whiting, 2016). While there were far fewer studies on the use of technology for graded or mandatory learning activities, the average participation rate reported in those studies was 97% (Bahati, 2015; Gagnon, 2015), suggesting that grading may be a key factor in ensuring students participate.

Communicate clear guidelines for technology use. Prior to the implementation of technology in a course, students may benefit from an overview of the technology, including its navigational features, privacy settings, and security (Andrew et al., 2015; Hurt et al., 2012; Martin et al., 2012), and from a set of guidelines for how to use the technology effectively and professionally within an educational setting (Miller et al., 2012; Prestridge, 2014; Staines & Lauchs, 2013; West et al., 2015). In addition, giving students examples of exemplary and poor entries and posts may also help to clarify how they are expected to use the technology (Shraim, 2014; Roussinos & Jimoyiannis, 2013). Also, if instructors expect students to use technology to demonstrate higher-order thinking or to interact with peers, there should be explicit instructions to do so. For example, Prestridge (2014) found that students used Twitter to ask the instructor questions, but very few interacted with peers because they were not explicitly asked to do so. Similarly, Hou et al. (2015) reported low levels of knowledge construction in Facebook, acknowledging that the wording of the learning activity (e.g., explore and present applications of computer networking) and the lack of probing questions in the instructions may have been to blame.

Use technology to provide authentic and integrated learning experiences. In many studies, instructors used digital games to simulate authentic environments in which students could apply new knowledge and skills, which ultimately led to a greater understanding of content and evidence of higher-order thinking (Beckem & Watkins, 2012; Liu et al., 2011; Lu et al., 2014; Marriott et al., 2015; Siddique et al., 2013). For example, in one study, students were required to play the role of a stock trader in a simulated trading environment, and they reported that the simulation helped them engage in critical reflection, enabling them to identify mistakes and weaknesses in their trading approaches and strategies (Marriott et al., 2015). In addition, integrating technology into regularly scheduled classroom activities, such as lectures, may help to promote student engagement. For example, in one study, the instructor posed a question in class, asked students to respond aloud or tweet their response, and projected the Twitter page so that everyone could see the tweets, which led to favorable comments about the usefulness of Twitter for promoting engagement (Tiernan, 2014).

Actively participate in using the technologies assigned to students during the first few weeks of the course to generate interest (Dougherty & Andercheck, 2014; West et al., 2015) and, preferably, throughout the course to answer questions, encourage dialogue, correct misconceptions, and address inappropriate behavior (Bowman & Akcaoglu, 2014; Hennessy et al., 2016; Junco et al., 2013; Roussinos & Jimoyiannis, 2013). Miller et al. (2012) found that faculty encouragement and prompting were associated with increases in students’ expression of ideas and in the degree to which they edited and elaborated on their peers’ work in a course-specific wiki.

Be mindful of privacy, security, and accessibility issues. In many studies, instructors took necessary steps to help ensure privacy and security by creating closed Facebook groups and private Twitter pages, accessible only to students in the course (Bahati, 2015; Bista, 2015; Bowman & Akcaoglu, 2014; Esteves, 2012; Rambe, 2012; Tiernan, 2014; Williams & Whiting, 2016), and by offering training to students on how to use privacy and security settings (Hurt et al., 2012). Instructors also made efforts to increase the accessibility of web-conferencing software by including a phone number for students unable to access audio or video through their computer and by recording and archiving sessions for students unable to attend due to pre-existing conflicts (Andrew et al., 2015; Martin et al., 2012). In the future, instructors should also keep in mind that some technologies, like Facebook and Twitter, are not accessible to students living in China; therefore, alternative arrangements may need to be made.

In 1985, Steve Jobs predicted that computers and software would revolutionize the way we learn. Over 30 years later, his prediction has yet to be fully confirmed in the student engagement literature; however, our findings offer preliminary evidence that the potential is there. Of the technologies we reviewed, digital games, web-conferencing software, and Facebook had the most far-reaching effects across multiple types and indicators of student engagement, suggesting that technology should be considered a factor that influences student engagement in existing models. Findings regarding blogs, wikis, and Twitter, however, are less convincing, given a lack of studies in relation to engagement indicators or mixed findings. Significant methodological limitations may account for the wide range of findings in the literature. For example, small sample sizes, inconsistent measurement of variables, lack of comparison groups, and missing details about specific, pedagogical uses of technologies threaten the validity and reliability of findings. Therefore, more rigorous and robust research is needed to confirm and build upon limited but positive findings, clarify mixed findings, and address gaps particularly regarding how different technologies influence emotional and cognitive indicators of engagement.

Abbreviations

LMS: Learning management system

Amirault, R. J. (2012). Distance learning in the 21st century university. Quarterly Review of Distance Education, 13(4), 253–265.

Anderson, M. (2016). More Americans using smartphones for getting directions, streaming TV. Washington, D.C.: Pew Research Center. Retrieved from http://www.pewresearch.org/fact-tank/2016/01/29/us-smartphone-use/.

Anderson, M., & Horrigan, J. B. (2016). Smartphones help those without broadband get online, but don’t necessarily bridge the digital divide. Washington, D.C.: Pew Research Center. Retrieved from http://www.pewresearch.org/fact-tank/2016/10/03/smartphones-help-those-without-broadband-get-online-but-dont-necessarily-bridge-the-digital-divide/.

Andrew, L., Maslin-Prothero, S., & Ewens, B. (2015). Enhancing the online learning experience using virtual interactive classrooms. Australian Journal of Advanced Nursing, 32 (4), 22–31.

Antunes, M., Pacheco, M. R., & Giovanela, M. (2012). Design and implementation of an educational game for teaching chemistry in higher education. Journal of Chemical Education, 89 (4), 517–521. doi: 10.1021/ed2003077 .

Armier, D. J., Shepherd, C. E., & Skrabut, S. (2016). Using game elements to increase student engagement in course assignments. College Teaching, 64 (2), 64–72 https://doi.org/10.1080/87567555.2015.1094439 .

Armstrong, A., & Thornton, N. (2012). Incorporating Brookfield’s discussion techniques synchronously into asynchronous online courses. Quarterly Review of Distance Education, 13 (1), 1–9.

Ashrafzadeh, A., & Sayadian, S. (2015). University instructors’ concerns and perceptions of technology integration. Computers in Human Behavior, 49 , 62–73. doi: 10.1016/j.chb.2015.01.071 .

Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of College Student Personnel, 25 (4), 297–308.

Auman, C. (2011). Using simulation games to increase student and instructor engagement. College Teaching, 59 (4), 154–161. doi: 10.1080/87567555 .

Axelson, R. D., & Flick, A. (2011). Defining student engagement. Change: The magazine of higher learning, 43 (1), 38–43.

Bahati, B. (2015). Extending student discussions beyond lecture room walls via Facebook. Journal of Education and Practice, 6 (15), 160–171.

Bakker, A. B., Vergel, A. I. S., & Kuntze, J. (2015). Student engagement and performance: A weekly diary study on the role of openness. Motivation and Emotion, 39 (1), 49–62. doi: 10.1007/s11031-014-9422-5 .

Beckem, J. I., & Watkins, M. (2012). Bringing life to learning: Immersive experiential learning simulations for online and blended courses. Journal of Asynchronous Learning Networks, 16(5), 61–70. https://doi.org/10.24059/olj.v16i5.287.

Bista, K. (2015). Is Twitter an effective pedagogical tool in higher education? Perspectives of education graduate students. Journal of the Scholarship of Teaching and Learning, 15(2), 83–102. https://doi.org/10.14434/josotl.v15i2.12825.

Boghossian, P. (2006). Behaviorism, constructivism, and Socratic pedagogy. Educational Philosophy and Theory, 38 (6), 713–722 https://doi.org/10.1111/j.1469-5812.2006.00226.x .

Bower, M. (2011). Redesigning a web-conferencing environment to scaffold computing students’ creative design processes. Journal of Educational Technology & Society, 14 (1), 27–42.

Bower, M. (2016). A framework for adaptive learning design in a Web-conferencing environment. Journal of Interactive Media in Education, 2016 (1), 11 http://doi.org/10.5334/jime.406 .

Bowman, N. D., & Akcaoglu, M. (2014). “I see smart people!”: Using Facebook to supplement cognitive and affective learning in the university mass lecture. The Internet and Higher Education, 23 , 1–8. doi: 10.1016/j.iheduc.2014.05.003 .

Boyle, E. A., Hainey, T., Connolly, T. M., Gray, G., Earp, J., Ott, M., et al. (2016). An update to the systematic literature review of empirical evidence of the impacts and outcomes of computer games and serious games. Computers & Education, 94 , 178–192. doi: 10.1016/j.compedu.2015.11.003 .

Bryson, C., & Hand, L. (2007). The role of engagement in inspiring teaching and learning. Innovations in Education and Teaching International, 44 (4), 349–362. doi: 10.1080/14703290701602748 .

Buchanan, T., Sainter, P., & Saunders, G. (2013). Factors affecting faculty use of learning technologies: Implications for models of technology adoption. Journal of Computer in Higher Education, 25 (1), 1–11.

Bullen, M., & Morgan, T. (2011). Digital learners not digital natives. La Cuestión Universitaria, 7 , 60–68.

Bullen, M., Morgan, T., & Qayyum, A. (2011). Digital learners in higher education: Generation is not the issue. Canadian Journal of Learning and Technology, 37 (1), 1–24.

Calabretto, J., & Rao, D. (2011). Wikis to support collaboration of pharmacy students in medication management workshops -- a pilot project. International Journal of Pharmacy Education & Practice, 8 (2), 1–12.

Camacho, M. E., Carrión, M. D., Chayah, M., & Campos, J. M. (2016). The use of wiki to promote students’ learning in higher education (Degree in Pharmacy). International Journal of Educational Technology in Higher Education, 13 (1), 1–8 https://doi.org/10.1186/s41239-016-0025-y .

Camus, M., Hurt, N. E., Larson, L. R., & Prevost, L. (2016). Facebook as an online teaching tool: Effects on student participation, learning, and overall course performance. College Teaching, 64 (2), 84–94 https://doi.org/10.1080/87567555.2015.1099093 .

Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student engagement and student learning: Testing the linkages. Research in Higher Education, 47 (1), 1–32. doi: 10.1007/s11162-005-8150-9 .

Cassidy, E. D., Colmenares, A., Jones, G., Manolovitz, T., Shen, L., & Vieira, S. (2014). Higher Education and Emerging Technologies: Shifting Trends in Student Usage. The Journal of Academic Librarianship, 40 , 124–133. doi: 10.1016/j.acalib.2014.02.003 .

Center for Postsecondary Research (2016). Engagement insights: Survey findings on the quality of undergraduate education . Retrieved from http://nsse.indiana.edu/NSSE_2016_Results/pdf/NSSE_2016_Annual_Results.pdf .

Center for Postsecondary Research (2017). About NSSE. Retrieved on February 15, 2017 from http://nsse.indiana.edu/html/about.cfm

Cercone, K. (2008). Characteristics of adult learners with implications for online learning design. AACE Journal, 16 (2), 137–159.

Chang, J. W., & Wei, H. Y. (2016). Exploring Engaging Gamification Mechanics in Massive Online Open Courses. Educational Technology & Society, 19 (2), 177–203.

Chawinga, W. D. (2017). Taking social media to a university classroom: teaching and learning using Twitter and blogs. International Journal of Educational Technology in Higher Education, 14 (1), 3 https://doi.org/10.1186/s41239-017-0041-6 .

Chen, B., Seilhamer, R., Bennett, L., & Bauer, S. (2015). Students’ mobile learning practices in higher education: A multi-year study. EDUCAUSE Review. Retrieved from http://er.educause.edu/articles/2015/6/students-mobile-learning-practices-in-higher-education-a-multiyear-study.

Chu, S. K., Chan, C. K., & Tiwari, A. F. (2012). Using blogs to support learning during internship. Computers & Education, 58 (3), 989–1000. doi: 10.1016/j.compedu.2011.08.027 .

Clements, J. C. (2015). Using Facebook to enhance independent student engagement: A case study of first-year undergraduates. Higher Education Studies, 5 (4), 131–146 https://doi.org/10.5539/hes.v5n4p131 .

Coates, H. (2008). Attracting, engaging and retaining: New conversations about learning. Camberwell: Australian Council for Educational Research. Retrieved from http://research.acer.edu.au/cgi/viewcontent.cgi?article=1015&context=ausse.

Coffey, D. J., Miller, W. J., & Feuerstein, D. (2011). Classroom as reality: Demonstrating campaign effects through live simulation. Journal of Political Science Education, 7 (1), 14–33.

Coghlan, E., Crawford, J., Little, J., Lomas, C., Lombardi, M., Oblinger, D., & Windham, C. (2007). ELI Discovery Tool: Guide to Blogging. Retrieved from https://net.educause.edu/ir/library/pdf/ELI8006.pdf.

Connolly, T. M., Boyle, E. A., MacArthur, E., Hainey, T., & Boyle, J. M. (2012). A systematic literature review of empirical evidence on computer games and serious games. Computers & Education, 59 , 661–686. doi: 10.1016/j.compedu.2012.03.004 .

Cook, C. W., & Sonnenberg, C. (2014). Technology and online education: Models for change. ASBBS E-Journal, 10 (1), 43–59.

Crocco, F., Offenholley, K., & Hernandez, C. (2016). A proof-of-concept study of game-based learning in higher education. Simulation & Gaming, 47 (4), 403–422. doi: 10.1177/1046878116632484 .

Csikszentmihalyi, M. (1988). The flow experience and its significance for human psychology. In M. Csikszentmihalyi & I. Csikszentmihalyi (Eds.), Optimal experience: Psychological studies of flow in consciousness (pp. 15–13). Cambridge, UK: Cambridge University Press.

Dahlstrom, E. (2012). ECAR study of undergraduate students and information technology, 2012 (Research Report). Retrieved from http://net.educause.edu/ir/library/pdf/ERS1208/ERS1208.pdf

de Freitas, S. (2006). Learning in immersive worlds: A review of game-based learning . Retrieved from https://curve.coventry.ac.uk/open/file/aeedcd86-bc4c-40fe-bfdf-df22ee53a495/1/learning%20in%20immersive%20worlds.pdf .

Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of Educational Technology in Higher Education, 14 (9), 1–36. doi: 10.1186/s41239-017-0042-5 .

DiVall, M. V., & Kirwin, J. L. (2012). Using Facebook to facilitate course-related discussion between students and faculty members. American Journal of Pharmaceutical Education, 76 (2), 1–5 https://doi.org/10.5688/ajpe76232 .


Acknowledgements

Not applicable.

Funding

This research was supported in part by a Laureate Education, Inc. David A. Wilson research grant awarded to the second author, “A Comparative Analysis of Student Engagement and Critical Thinking in Two Approaches to the Online Classroom”.

Availability of data and materials

Authors’ contributions

The first and second authors contributed significantly to the writing, review, and conceptual thinking of the manuscript. The third author provided a first detailed outline of what the paper could address, and the fourth author provided input and feedback through critical review. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Ethics approval and consent to participate

The parent study was approved by the University of Liverpool Online International Online Ethics Review Committee, approval number 04-24-2015-01.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Author information

Authors and affiliations

University of Liverpool Online, Liverpool, UK

Laura A. Schindler & Osama A. Morad

Laureate Education, Inc., Baltimore, USA

Gary J. Burkholder

Walden University, Minneapolis, USA

University of Lincoln, Lincoln, UK

Craig Marsh


Corresponding author

Correspondence to Laura A. Schindler .

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Reprints and permissions

About this article

Cite this article

Schindler, L.A., Burkholder, G.J., Morad, O.A. et al. Computer-based technology and student engagement: a critical review of the literature. Int J Educ Technol High Educ 14 , 25 (2017). https://doi.org/10.1186/s41239-017-0063-0

Download citation

Received : 31 March 2017

Accepted : 06 June 2017

Published : 02 October 2017

DOI : https://doi.org/10.1186/s41239-017-0063-0





Designing Assignments for Learning

The rapid shift to remote teaching and learning meant that many instructors reimagined their assessment practices. Whether adapting existing assignments or creatively designing new opportunities for their students to learn, instructors focused on helping students make meaning and demonstrate their learning outside of the traditional, face-to-face classroom setting. This resource distills the elements of assignment design that are important to carry forward as we continue to seek better ways of assessing learning and build on our innovative assignment designs.

Rethinking traditional tests, quizzes, and exams


Cite this resource: Columbia Center for Teaching and Learning (2021). Designing Assignments for Learning. Columbia University. Retrieved [today’s date] from https://ctl.columbia.edu/resources-and-technology/teaching-with-technology/teaching-online/designing-assignments/

Traditional assessments tend to reveal whether students can recognize, recall, or replicate what was learned out of context, and tend to focus on students providing correct responses (Wiggins, 1990). In contrast, authentic assignments engage students in higher-order thinking: students grapple with real or simulated challenges that help them prepare for their professional lives, drawing on the course knowledge learned and the skills acquired to create justifiable answers, performances, or products (Wiggins, 1990). An authentic assessment provides opportunities for students to practice, consult resources, learn from feedback, and refine their performances and products accordingly (Wiggins, 1990, 1998, 2014).

Authentic assignments ask students to “do” the subject with an audience in mind and apply their learning in a new situation. Examples of authentic assignments include asking students to: 

  • Write for a real audience (e.g., a memo, a policy brief, letter to the editor, a grant proposal, reports, building a website) and/or publication;
  • Solve problem sets that have real world application; 
  • Design projects that address a real world problem; 
  • Engage in a community-partnered research project;
  • Create an exhibit, performance, or conference presentation;
  • Compile and reflect on their work through a portfolio/e-portfolio.

Noteworthy elements of authentic designs are that instructors scaffold the assignment and play an active role in preparing students for the tasks assigned, while students are intentionally asked to reflect on the process and product of their work, thus building their metacognitive skills (Herrington and Oliver, 2000; Ashford-Rowe, Herrington and Brown, 2013; Frey, Schmitt, and Allen, 2012).

It’s worth noting that authentic assessments can initially be time consuming to design, implement, and grade, and they have been critiqued as challenging to use across course contexts and prone to grading reliability issues (Maclellan, 2004). Despite these challenges, authentic assessments are recognized as beneficial to student learning (Svinicki, 2004): they are learner-centered (Weimer, 2013), promote academic integrity (McLaughlin and Ricevuto, 2021; Sotiriadou et al., 2019; Schroeder, 2021), and motivate students to learn (Ambrose et al., 2010). The Columbia Center for Teaching and Learning is always available to consult with faculty who are considering authentic assessment designs and to discuss challenges and affordances.

Examples from the Columbia University Classroom 

Columbia instructors have experimented with alternative ways of assessing student learning from oral exams to technology-enhanced assignments. Below are a few examples of authentic assignments in various teaching contexts across Columbia University. 

  • E-portfolios: Statia Cook shares her experiences with an ePortfolio assignment in her co-taught Frontiers of Science course (a submission to the Voices of Hybrid and Online Teaching and Learning initiative); CUIMC use of ePortfolios;
  • Case studies: Columbia instructors have engaged their students in authentic ways through case studies drawing on the Case Consortium at Columbia University. Read and watch a faculty spotlight to learn how Professor Mary Ann Price uses the case method to place pre-med students in real-life scenarios;
  • Simulations: students at CUIMC engage in simulations to develop their professional skills in The Mary & Michael Jaharis Simulation Center in the Vagelos College of Physicians and Surgeons and the Helene Fuld Health Trust Simulation Center in the Columbia School of Nursing; 
  • Experiential learning: instructors have drawn on New York City as a learning laboratory such as Barnard’s NYC as Lab webpage which highlights courses that engage students in NYC;
  • Design projects that address real world problems: Yevgeniy Yesilevskiy on the Engineering design projects completed using lab kits during remote learning. Watch Dr. Yesilevskiy talk about his teaching and read the Columbia News article.
  • Writing assignments: Lia Marshall and her teaching associate Aparna Balasundaram reflect on their “non-disposable or renewable assignments” to prepare social work students for their professional lives as they write for a real audience; and Hannah Weaver spoke about a sandbox assignment used in her Core Literature Humanities course at the 2021 Celebration of Teaching and Learning Symposium. Watch Dr. Weaver share her experiences.

Tips for Designing Assignments for Learning

While designing an effective authentic assignment may seem like a daunting task, the following tips can be used as a starting point. See the Resources section for frameworks and tools that may be useful in this effort.  

Align the assignment with your course learning objectives 

Identify the kind of thinking that is important in your course, the knowledge students will apply, and the skills they will practice through the assignment. What kind of thinking will students be asked to do for the assignment? What will students learn by completing this assignment? How will the assignment help students achieve the desired course learning outcomes? For more information on course learning objectives, see the CTL’s Course Design Essentials self-paced course and watch the video on Articulating Learning Objectives.

Identify an authentic meaning-making task

For meaning-making to occur, students need to understand the relevance of the assignment to the course and beyond (Ambrose et al., 2010). To Bean (2011) a “meaning-making” or “meaning-constructing” task has two dimensions: 1) it presents students with an authentic disciplinary problem or asks students to formulate their own problems, both of which engage them in active critical thinking, and 2) the problem is placed in “a context that gives students a role or purpose, a targeted audience, and a genre.” (Bean, 2011: 97-98). 

An authentic task gives students a realistic challenge to grapple with, a role to take on that allows them to “rehearse for the complex ambiguities” of life, provides resources and supports to draw on, and requires students to justify their work and the process they used to inform their solution (Wiggins, 1990). Students who find an assignment interesting or relevant are more likely to see value in completing it.

Consider the kinds of activities in the real world that use the knowledge and skills that are the focus of your course. How are this knowledge and these skills applied to answer real-world questions and solve real-world problems? (Herrington et al., 2010: 22). What do professionals or academics in your discipline do on a regular basis? What does it mean to think like a biologist, statistician, historian, or social scientist? How might your assignment ask students to draw on current events, issues, or problems that relate to the course and are of interest to them? How might your assignment tap into student motivation and engage students in the kinds of thinking they can apply to better understand the world around them? (Ambrose et al., 2010).

Determine the evaluation criteria and create a rubric

To ensure equitable and consistent grading across students, make transparent the criteria you will use to evaluate student work. The criteria should focus on the knowledge and skills that are central to the assignment. Building on the criteria you identify, create a rubric that makes the expectations for deliverables explicit, and share it with your students so they can use it as they work on the assignment. For more information on rubrics, see the CTL’s resource Incorporating Rubrics into Your Grading and Feedback Practices, and explore the Association of American Colleges & Universities VALUE Rubrics (Valid Assessment of Learning in Undergraduate Education).
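For instructors who manage grading digitally, the principle above — explicit criteria with transparent weights — can be sketched as data. The criteria names, performance levels, weights, and point mapping below are illustrative assumptions, not a recommended rubric:

```python
# A minimal sketch of a rubric as data, so criteria and weights are explicit.
# Criteria, levels, weights, and the 25-points-per-level mapping are all
# illustrative assumptions, not recommendations.

RUBRIC = {
    "analysis":      {"weight": 0.40, "levels": ["emerging", "developing", "proficient", "exemplary"]},
    "evidence":      {"weight": 0.35, "levels": ["emerging", "developing", "proficient", "exemplary"]},
    "communication": {"weight": 0.25, "levels": ["emerging", "developing", "proficient", "exemplary"]},
}

def score(ratings):
    """Convert per-criterion level ratings into a weighted score out of 100."""
    total = 0.0
    for criterion, spec in RUBRIC.items():
        level_index = spec["levels"].index(ratings[criterion])  # 0..3
        points = 25 * (level_index + 1)                         # 25..100
        total += spec["weight"] * points
    return round(total, 1)

print(score({"analysis": "proficient", "evidence": "exemplary", "communication": "developing"}))  # → 77.5
```

Keeping the rubric in an explicit, shareable form like this makes it easy to publish to students exactly the criteria that will be used in grading.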

Build in metacognition

Ask students to reflect on what and how they learned from the assignment. Help students uncover the personal relevance of the assignment, find intrinsic value in their work, and deepen their motivation by asking them to reflect on their process and their deliverable. Sample prompts might include: What did you learn from this assignment? How might you draw on the knowledge and skills you used in this assignment in the future? See Ambrose et al. (2010) for more strategies that support motivation, and the CTL’s resource on Metacognition.

Provide students with opportunities to practice

Design your assignment to be a learning experience, and prepare students for success on it. If students can reasonably expect to succeed on an assignment when they put in the required effort, with the support and guidance of the instructor, they are more likely to engage in the behaviors necessary for learning (Ambrose et al., 2010). Set students up for success by actively teaching the knowledge and skills of the course (e.g., how to solve problems, how to write for a particular audience), modeling the desired thinking, and creating learning activities that build up to a graded assignment. Provide opportunities for students to practice using the knowledge and skills they will need for the assignment, whether through low-stakes in-class activities or homework that includes opportunities to receive and incorporate formative feedback. For more information on providing feedback, see the CTL resource Feedback for Learning.

Communicate about the assignment 

Share the purpose, task, audience, expectations, and criteria for the assignment. Students may have expectations about assessments and how they will be graded that are informed by their prior experiences completing high-stakes assessments, so be transparent. Tell your students why you are asking them to do this assignment, what skills they will be using, how it aligns with the course learning outcomes, and why it is relevant to their learning and their professional lives (i.e., how practitioners and professionals use the knowledge and skills taught in your course in real-world contexts, and for what purposes). Finally, verify that students understand what they need to do to complete the assignment. You can do this by asking students to respond to poll questions about different parts of the assignment, by running a “scavenger hunt” of the assignment instructions (giving students questions about the assignment and having them answer in small groups), or by having students share back what they think is expected of them.

Plan to iterate and to keep the focus on learning 

Draw on multiple sources of data to decide what changes to the assignment, its instructions, and/or its rubric are needed to ensure that it contributes to student learning. Explore assignment performance data. As Deandra Little reminds us: “a really good assignment, which is a really good assessment, also teaches you something or tells the instructor something. As much as it tells you what students are learning, it’s also telling you what they aren’t learning.” (Teaching in Higher Ed podcast episode 337). Assignment bottlenecks, the points where students get stuck or struggle, can be good indicators that students need further support or opportunities to practice before completing an assignment. This awareness can inform teaching decisions.

Triangulate the performance data by collecting student feedback, and noting your own reflections about what worked well and what did not. Revise the assignment instructions, rubric, and teaching practices accordingly. Consider how you might better align your assignment with your course objectives and/or provide more opportunities for students to practice using the knowledge and skills that they will rely on for the assignment. Additionally, keep in mind societal, disciplinary, and technological changes as you tweak your assignments for future use. 

Now is a great time to reflect on your practices and experiences with assignment design and think critically about your approach. Take a closer look at an existing assignment. Questions to consider include: What is this assignment meant to do? What purpose does it serve? Why do you ask students to do this assignment? How are they prepared to complete the assignment? Does the assignment assess the kind of learning that you really want? What would help students learn from this assignment? 

Using the tips in the previous section: How can the assignment be tweaked to be more authentic and meaningful to students? 

As you plan forward for post-pandemic teaching and reflect on your practices and reimagine your course design, you may find the following CTL resources helpful: Reflecting On Your Experiences with Remote Teaching , Transition to In-Person Teaching , and Course Design Support .

The Columbia Center for Teaching and Learning (CTL) is here to help!

For assistance with assignment design, rubric design, or any other teaching and learning need, please request a consultation by emailing [email protected]

Transparency in Learning and Teaching (TILT) framework for assignments. The TILT Examples and Resources page (https://tilthighered.com/tiltexamplesandresources) includes example assignments from across disciplines, as well as a transparent assignment template and a checklist for designing transparent assignments. Each emphasizes the importance of articulating to students the purpose of the assignment or activity and the what and how of the task, and of specifying the criteria that will be used to assess students.

Association of American Colleges & Universities (AAC&U) offers VALUE ADD (Assignment Design and Diagnostic) tools (https://www.aacu.org/value-add-tools) to help with the creation of clear and effective assignments that align with the desired learning outcomes and associated VALUE rubrics (Valid Assessment of Learning in Undergraduate Education). VALUE ADD encourages instructors to explicitly state assignment information such as the purpose of the assignment, the skills students will be using, how it aligns with course learning outcomes, the assignment type, the audience and context for the assignment, clear evaluation criteria, desired formatting, and expectations for completion, whether individual or in a group.

Villarroel et al. (2017) propose a blueprint for building authentic assessments that includes four steps: 1) consider the workplace context, 2) design the authentic assessment, 3) learn and apply standards for judgement, and 4) give feedback.

References 

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., and Norman, M. K. (2010). Chapter 3: What Factors Motivate Students to Learn? In How Learning Works: Seven Research-Based Principles for Smart Teaching. Jossey-Bass.

Ashford-Rowe, K., Herrington, J., and Brown, C. (2013). Establishing the critical elements that determine authentic assessment. Assessment & Evaluation in Higher Education. 39(2), 205-222, http://dx.doi.org/10.1080/02602938.2013.819566 .  

Bean, J.C. (2011). Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom . Second Edition. Jossey-Bass. 

Frey, B. B, Schmitt, V. L., and Allen, J. P. (2012). Defining Authentic Classroom Assessment. Practical Assessment, Research, and Evaluation. 17(2). DOI: https://doi.org/10.7275/sxbs-0829  

Herrington, J., Reeves, T. C., and Oliver, R. (2010). A Guide to Authentic e-Learning . Routledge. 

Herrington, J. and Oliver, R. (2000). An instructional design framework for authentic learning environments. Educational Technology Research and Development, 48(3), 23-48. 

Litchfield, B. C. and Dempsey, J. V. (2015). Authentic Assessment of Knowledge, Skills, and Attitudes. New Directions for Teaching and Learning. 142 (Summer 2015), 65-80. 

Maclellan, E. (2004). How convincing is alternative assessment for use in higher education. Assessment & Evaluation in Higher Education. 29(3), June 2004. DOI: 10.1080/0260293042000188267

McLaughlin, L. and Ricevuto, J. (2021). Assessments in a Virtual Environment: You Won’t Need that Lockdown Browser! Faculty Focus. June 2, 2021. 

Mueller, J. (2005). The Authentic Assessment Toolbox: Enhancing Student Learning through Online Faculty Development . MERLOT Journal of Online Learning and Teaching. 1(1). July 2005. Mueller’s Authentic Assessment Toolbox is available online. 

Schroeder, R. (2021). Vaccinate Against Cheating With Authentic Assessment . Inside Higher Ed. (February 26, 2021).  

Sotiriadou, P., Logan, D., Daly, A., and Guest, R. (2019). The role of authentic assessment to preserve academic integrity and promote skills development and employability. Studies in Higher Education. 45(11), 2132-2148. https://doi.org/10.1080/03075079.2019.1582015

Stachowiak, B. (Host). (November 25, 2020). Authentic Assignments with Deandra Little. (Episode 337). In Teaching in Higher Ed . https://teachinginhighered.com/podcast/authentic-assignments/  

Svinicki, M. D. (2004). Authentic Assessment: Testing in Reality. New Directions for Teaching and Learning. 100 (Winter 2004): 23-29. 

Villarroel, V., Bloxham, S, Bruna, D., Bruna, C., and Herrera-Seda, C. (2017). Authentic assessment: creating a blueprint for course design. Assessment & Evaluation in Higher Education. 43(5), 840-854. https://doi.org/10.1080/02602938.2017.1412396    

Weimer, M. (2013). Learner-Centered Teaching: Five Key Changes to Practice . Second Edition. San Francisco: Jossey-Bass. 

Wiggins, G. (2014). Authenticity in assessment, (re-)defined and explained. Retrieved from https://grantwiggins.wordpress.com/2014/01/26/authenticity-in-assessment-re-defined-and-explained/

Wiggins, G. (1989). Teaching to the (Authentic) Test. Educational Leadership. April 1989. 41-47.

Wiggins, G. (1990). The Case for Authentic Assessment. Practical Assessment, Research & Evaluation, 2(2).

Wondering how AI tools might play a role in your course assignments?

See the CTL’s resource “Considerations for AI Tools in the Classroom.”


Center for Teaching and Learning


Planning a Teaching Demonstration

Teaching demonstrations are artificial: the students aren’t yours, you won’t see them again for follow-up lessons, and you might even be “teaching” faculty. But they are also a critical part of an academic job interview. Candidates for academic positions need to show more than knowledge of their content area in their teaching demonstration; they need to show that they have pedagogical content knowledge: the ability to select, structure, and deliver complicated content so that students can learn it. Your teaching demonstration must show not only that you can create and follow a lesson plan, but also that you can engage and interact with students to enhance their learning.

You’ll want your teaching demonstration to reinforce whatever you’ve said about your teaching in your application materials. For example, if you have said that you create student-centered classrooms and provide students opportunities to actively learn, don't lecture for the entire time during your demonstration. 

A successful teaching demonstration ultimately comes down to careful planning and practice. If you showcase your best teaching during your demo, you’ll go a long way toward convincing the committee that you can handle the challenges of teaching day-to-day. The guidelines and tips below will get you started.  

A. Know your Audience

  • Will you be teaching a class of actual students, a group of faculty, the hiring committee, or some combination of these three groups?
  • What level of student should you be preparing for? (E.g., Majors, non-majors, graduate, etc.)
  • If you are teaching a class of actual students, ask for a copy of the course syllabus and any relevant assignments. Read the course description and objectives, and review a copy of the textbook. It might also help to get to know the students in general by looking at the university’s website and, if possible, by visiting campus and chatting directly with students. More realistically, you might attend a class at your current institution on the topic you are going to be teaching and then talk with the instructor, the TAs, and the students.
  • If you are teaching to faculty members posing as students, be sure to indicate for them the level and background of the students for whom your lesson would be intended, then pretend that the faculty members are those students and teach at the correct level. Expect, however, that faculty might ask questions at a higher level than would actual students and don’t go overboard with pretending that they are students (e.g., don’t confiscate a cell phone if one of them can’t stop looking at it!).  

B. Make your Material fit the Course and the Time

  • If you’re given a topic to teach in an actual course, find out where that topic fits into the course itself. What have the students learned beforehand? What will they be learning afterward? What assignments will they be working on? What textbook are the students using? Get a copy and read the relevant sections.
  • If you’ve been given a broad topic area from which to select a particular lesson, choose something that you can manage in the time given.
  • If you’re teaching for a full class period, aim to end no more than 5 to 10 minutes early to leave time for questions. Have a backup plan in case you end earlier for any reason.
  • If you’re only teaching a short lesson in 10 to 15 minutes, choose a topic or lesson that will stand on its own in that time. Don’t squeeze a 50-minute lecture into 15 minutes.
  • Plan enough time for any activities you’ll include; they can sometimes run long if not properly planned and managed.

C. Engage your Students

  • Remember, this is your teaching demo, not your research talk. Don’t just lecture to the students; show that you can do something more by engaging them with active learning. Get the students interested, involved, and interacting positively with you and with one another—they might be evaluating you for the committee.
  • Use brief, meaningful activities that last no more than 5-7 minutes each. If you’re teaching faculty members, don’t expect them to be any more interested in participating in activities than students are. Create a handout; ask questions.
  • Start with a relevant hook to grab students’ interest (an alarming statistic, a current event, a thought-provoking question, etc.)
  • If you’re teaching a small group of students, bring index cards and black sharpies. Have the students write their names on the cards and set them up on their desks. Doing so creates an instant connection with students by allowing you to address them by name as you would in a class of your own.

D. Use Technology Purposefully and Effectively

  • If you plan to use technology, be sure that it serves some clear and relevant pedagogical aim; don’t use it just to impress the committee or to show off your techy side. Technology shouldn’t overwhelm the topic you’re teaching, and the contribution that technology makes to student learning should be obvious and significant. Handouts are often a better alternative to technology, since they provide everyone with a concrete takeaway by which to remember you and your demonstration.
  • Use visuals only to support your teaching and promote learning. For example, PowerPoint slides should be used sparingly and should ideally include questions or problems to which students can respond. Remember, PowerPoint should support your teaching, it shouldn’t be your centerpiece. If you do use a PowerPoint, be sure to tell students that you’ll write on the board any key information that you would like them to put in their own notes, otherwise students might try to write down everything you have on your slides.
  • In terms of PowerPoint design, use pictures, colors, and animations, but do so carefully, and don’t put too much text on any single slide. Choose a light background and dark text, and make sure that the slides are visible in a well-lit room. (You shouldn’t plan to use PowerPoint slides in a darkened room unless you want to put students to sleep.)
  • Use the board only if your handwriting is good. When writing on the board, don’t speak to it. That is, face the students and say whatever you want them to hear, and then turn and write it on the board. Doing so maintains your connection with the students and gives them an opportunity to copy down what you write.
  • Plan for technology to break down. Have an alternative plan.

E. Have a Backup Plan. Have Another.

  • Create your ideal lesson plan, a contingency plan in case you run out of time, a contingency plan in case you finish early and have too much time remaining, a contingency plan in case students simply don’t respond or if things are otherwise not working out as intended. Plan for technology to fail and know what you’ll do if and when it does.
  • Plan more material than you can possibly use, and make decisions in the moment about what to leave out. Don’t indicate to your students, however, that you’re cutting something due to time constraints.

F. Practice. Practice. Practice.

  • Whether you’ve taught before or not, you can ask colleagues or mentors for the opportunity to lead a session in their classes. Ask them to observe your session and provide feedback. Alternately, gather some colleagues, perhaps from different disciplines, to serve as a group of students whom you can teach. Have them ask you questions just like actual students would. After the lesson, have your colleagues comment on your flow, on the way your topics connected with each other, on your body language and any verbal or physical tics you might have, and, of course, on how you might improve your overall performance.
  • If you have taught before, review any observation reports you may have from colleagues or mentors, as well as evaluation feedback from students. Consider what has worked well and what hasn’t. What improvements can you realistically make and practice before your demonstration?

Helpful Tips and Hints

  • Aim to be relaxed and confident in your demonstration, but also plan to show your enthusiasm and passion for the topic.
  • Remember that you want your demonstration to be accessible to the intended audience, as well as factually or procedurally accurate and also clearly effective in terms of student learning.
  • Show respect for students and that you like working with them. Acknowledge their contributions and thank them for participating.
  • Don’t let talkative students sidetrack you. Indicate that you are glad they are interested, but that you need to continue the class. Ask them to stay after to discuss the material with you.
  • Consider providing students and the committee with suggested follow-up assignments or next steps to show that you are aware that teaching is a continuum, not a one-off intervention.
  • If you use graphs or other data visualizations, don’t let them speak for themselves. Instead, get students to respond to these visualizations. For example, orient students to a graph by briefly explaining what it shows, then pose questions about the graph and ask students to interpret it in some way to get students involved.
  • Push yourself to demonstrate your best teaching, but don’t try a technique or technology with which you’re not yet completely comfortable.

Questions to Consider as You Begin Planning your Teaching Demonstration

Don’t be afraid to ask the committee for details and clarification about your demonstration. At the same time, you need to ask yourself a number of important questions as you get started. The list below should help get you going.

Ask the committee:

  • How much time will I have? A whole class period or only 10 to 20 minutes?
  • Whom will I be teaching? Actual students or faculty posing as students?
  • At what level should my teaching be aimed? Majors? Non-majors? Graduate?
  • Will a topic and/or materials be provided, or should I select a topic and/or materials on my own?
  • If a topic is provided and if I’m teaching in an actual course, how does the topic fit into the course in relation to other topics? Can I get a copy of the syllabus? What textbook do the students use? Have students been given any homework? If so, what? Can I get a copy of the assignment materials?
  • Where will I be teaching? What sorts of technology or other resources are available?
  • How and by whom will my teaching be evaluated? If I’m teaching actual students, will they provide any feedback to the committee?

Ask yourself:

  • Exactly what information, and how much of it, do I want to convey in the time I have?
  • What approach is most appropriate for the topic, the students, and the institution itself? Will I mostly lecture or will I involve students in a discussion or an activity?
  • Do I want to use technology? If so, what will that technology add to my demo in terms of helping students learn? Am I comfortable using the technology that is available?

Reference and Resources

  • Smith, M. K., Wenderoth, M. P., and Tyler, M. (2013). The teaching demonstration: What faculty expect and how to prepare for this aspect of the job interview. CBE Life Sciences Education, 12(1), 12–18. http://doi.org/10.1187/cbe.12-09-0161
  • CTL's teaching demonstration rubric - the CTL uses this teaching demonstration rubric to evaluate and provide feedback on teaching demonstrations given by graduate students in the Emerging Scholars of College Instruction Program. 

The CTL offers a workshop on preparing and delivering an effective teaching demonstration. Watch a recording of the spring 2020 webinar facilitated by Dr. Debbie Herold, in which she provides several strategies and guidelines for a successful teaching demonstration. Panelists Dr. Katherine Adams and graduate student Matt Walsh share their experiences of giving teaching demonstrations, along with tips to prepare for and navigate the unexpected challenges that may arise during a demonstration.

For more information about teaching demonstrations, contact Jessica Alexander ( [email protected] ) or Anusha S. Rao ( [email protected] ). 

Updated by Anusha S. Rao, April 20, 2020. Created by James Gregory, November 2016.


NACADA

Academic Advising Resources


Using Technology for Intentional Student Evaluation and Program Assessment

Authored by: George E. Steele, 2015

If Aristotle is correct, then we in academic advising can always use assistance to be as intentional as possible in our practice. Advising as teaching is a paradigm that has been advocated by many authors (Lowenstein, 2005; Hemwall & Trachte, 1999, 2003), but intentionally identifying what students need to learn is critical. As Martin (2007) stated, “Learning objectives answer the question: what should students learn through academic advising?” Likewise, Steele (2014) argued for the intentional use of technologies as tools.

Tools are designed for specific uses. Technologies are used best when their capabilities align with our advising goals. This article will use elements of Steele’s model for the intentional use of technology and combine them with elements of the curriculum development model called Understanding by Design to help advisors achieve better student learning outcomes and improve program assessment. Integrated, these two models offer a conceptual way to reconsider how to organize learning outcomes and program assessment.

Curriculum Development Stages Useful to Assessment

Intentionality is a critical component of the curriculum development model Understanding by Design (UbD), created by McTighe and Wiggins (2005, 2011). A synopsis of their UbD model contains three main stages:

Stage 1:  Identify desired results by identifying what students should know, understand, and be able to do.

Stage 2:  Determine assessment evidence by identifying how we will know if students have achieved the desired results.

Stage 3:  Plan learning experiences and instruction that address the transfer of learning, meaning-making, and acquisition.

McTighe and Wiggins’s primary focus has been on K–12 education. However, their model has much to offer college educators. This article outlines the advantages their model offers academic advisors seeking to develop a teaching and learning approach to advising.

Understanding by Design—the Model

Stage 1: Identifying Desired Results

While the UbD model offers more in-depth considerations and guidelines than can be presented here, several of its key points are helpful for our purpose. In its crucial first stage, the UbD model offers insights about identifying desired results: identifying what students should know, understand, and be able to do. The model suggests that learning outcomes be evaluated in the context of the key understandings of each outcome and the essential questions related to it. As an example, an essential academic advising goal is engaging in self-assessment to develop an academic and career plan. Understandings and essential questions related to this goal are suggested in the graph below.

By determining which learning outcomes are essential, versus those that are merely “nice to know,” we are forced to make value decisions about what we believe is critical for our students to learn. For example, which is the more critical learning outcome: having a student complete a successful educational and career planning project, or showing up to a scheduled meeting on time? By determining the essential academic advising learning outcomes in this fashion, we can also stress the importance of aligning them with our institutional mission and working with the collaborative campus community to achieve them (Campbell, 2008; Robbins & Zarges, 2011).

Stage 2: Determining Assessment Evidence  

The second stage McTighe and Wiggins identify is determining assessment evidence. For our sake, let’s call this student learning evaluation to better align it with the nomenclature of academic advising. At this stage, it is also important to consider which technology tools can best assist with capturing evidence of student learning. This decision should be guided by the levels of learning we seek.

McTighe and Wiggins (2012) present their own taxonomy of terms to describe the different levels of learning. In order of complexity, it begins at the simple stage of explaining and moves through interpreting, applying, demonstrating perspective, and displaying empathy, ending with self-knowledge. For them, this last level is a meta-cognitive level of awareness in which one uses productive habits of mind, developed through prior learning, to reflect on the meaning of current learning and experiences. Those familiar with Bloom’s taxonomies (Bloom, 1956; Anderson et al., 2001) can see parallels with McTighe and Wiggins’s taxonomy in terms of defining learning outcomes. Both move from simple to more complex functioning; while McTighe and Wiggins present one classification taxonomy, Bloom and his colleagues defined three taxonomy domains: cognitive, affective, and behavioral.

Stage 3: Designing and Planning Activities  

Stage three of McTighe and Wiggins’s approach is to plan learning experiences and instruction that address the transfer of learning, meaning-making, and acquisition. At this stage, advisors can consider what activities might help achieve the desired learning outcomes. Consider the example from stage one, self-assessment and exploration; this stage might include the following activities:

  • online self-assessment tools,
  • resources for informational interviews with students and faculty in specific academic programs,
  • lists of career options related to particular academic majors, and
  • students' reflective writing assignments focused on their academic and career goals.

A quick scan of these suggested activities shows that they include both virtual and traditional experiences. Their advantages are maintained as long as the means of evaluation relies on digital technologies. By evaluating digitally, advisors can capitalize on the key to McTighe and Wiggins’s (2012) approach, which “encourages teachers and curriculum developers to first think like assessors before designing specific units and lessons” (p. 7).

Applying Steele’s Intentional Use of Technology

The intentional use of technology model identifies three categories of technologies used in advising: service, engagement, and learning (Steele, 2014). The service category highlights tools that provide institutional services through personalized student accounts, such as student information systems and degree audits. The engagement category identifies tools used to inform and build communities with students and others at the institution, such as social media, static websites, and many electronic communication tools. The learning category includes tools such as e-Portfolios, learning management systems, interactive video conferencing, and early alert systems. A key element in this category is a means to digitally evaluate student learning: mastery of content, skills developed, a project produced, a plan submitted, or a demonstration of reflection on a topic or issue. While it seems self-evident that tools in the learning category can assist with achieving learning outcomes, integrating tools from all three areas can create a more robust and enriched digital learning environment. Such a combination of tools can help advisors address student learning and program assessment in a robust manner.

Learning Taxonomies and Student Tools for Evaluation

Regardless of the taxonomy used, the central point is that different types of evaluation tools are better suited to evaluating different levels of student learning. Using the general learning goal presented earlier, the chart below suggests relationships between 1) assessment evidence, 2) types of technologies identified under learning in Steele's model, and 3) specific digital assessment tools used within those technologies.

The progression illustrated in this table shows that technology tools for evaluating students differ in their effectiveness at different levels of student learning. Technology tools should be aligned with student learning outcomes, beginning with simple tools found in most learning management systems (e.g., quizzes, fill-in-the-blank, matching, and short-answer responses). Evaluation tools at this level tend to focus on the lower levels of learning taxonomies: information recall, comparing, or explaining. More complex learning outcomes require more sophisticated technology tools, such as those that allow students to write papers demonstrating careful reflection on the exploration process or to submit an academic and career exploration project in an e-Portfolio. These more sophisticated efforts often rely on rubrics for evaluating student learning.
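The alignment described above, simple LMS tools for lower-order outcomes and rubric-scored artifacts for higher-order ones, can be expressed as a rough ordered mapping. The level names follow the revised Bloom's taxonomy cited in the references (Anderson et al., 2001); the specific pairings are illustrative assumptions, not the article's chart.

```python
# Illustrative alignment of revised Bloom's taxonomy levels with digital
# evaluation tools, ordered from simple to complex. Pairings are examples
# consistent with the discussion above, not a prescribed chart.
TOOL_ALIGNMENT = [
    ("remember",   "LMS quiz: multiple choice, fill-in-the-blank, matching"),
    ("understand", "LMS short-answer response"),
    ("apply",      "degree-audit exercise submitted through the LMS"),
    ("analyze",    "reflection paper scored with a rubric"),
    ("evaluate",   "rubric-scored critique of academic and career options"),
    ("create",     "e-Portfolio exploration project scored with a rubric"),
]

for level, tool in TOOL_ALIGNMENT:
    print(f"{level:>10}: {tool}")
```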

Evaluation tools at this level tend to focus on the higher levels of the learning taxonomies that seek demonstrations of meta-cognitive levels of awareness.

Such approaches to evaluating students are familiar to academic advisors in traditional learning settings. When advisors use appropriate technologies to help assess student learning, two critical advantages emerge. First, content modules related to specific learning outcomes are aligned with evaluation tools through an LMS or e-Portfolio. Second, the results of student learning are created digitally in relation to specific learning outcomes.

Integrating Technologies for Improving Understanding

Technologies found in the learning component of Steele's model are not the only technologies that can advance learning. For instance, degree audits are categorized under services. We can ask students to produce a degree audit as one measure of a student learning outcome. More importantly, we can have students interpret what the degree audit means to them through a reflection paper submitted in the LMS or e-Portfolio.

Many of the technologies listed in the engagement category exist outside the institution's student portal, crossing what Steele referred to as the "FERPA line," beyond which institutional security does not exist. Thus, advisors must use these technologies with caution. Still, we can suggest that students use services like LinkedIn to connect with alumni who participate because they want to share their career experiences with currently enrolled students. Students could consider their interactions with these alumni in an e-Portfolio project or a reflection paper submitted through an LMS. Students who engage in these types of activities demonstrate higher-order thinking through reflection.

Implications for Using Technology in Program Assessment

The models of McTighe and Wiggins and of Steele can help advisors with program assessment in significant ways. One advantage is that outcomes for advising can be classified as process/delivery outcomes (PDOs) for the advising program and student learning outcomes (SLOs) (Campbell, 2008; Robbins & Zarges, 2011). Creating SLOs is necessary for stage one of the UbD model. When technology is used to evaluate student learning outcomes, students' results are in a digital format that can be easily aggregated. Thus, advisors can more efficiently determine how well a program's activities helped students achieve desired SLOs.

Another benefit is that because an integrated approach encourages advising programs to use technology for content and service delivery, students use technologies in different ways. When students use technologies located within the institution's student portal (service and learning technologies), their use is easier to track. When students run a degree audit report to help explain their choices for educational and career planning, that use of the degree audit produces PDO data. If they write a paper describing why they chose a particular program, the document can be an activity associated with an SLO. Aggregating both SLO and PDO data in this manner creates rich sources for program assessment.
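Because rubric scores and usage records are captured digitally, aggregating them for program assessment becomes a small data exercise. A minimal sketch, assuming each record holds a student ID, an outcome identifier, and a rubric score on a 1-4 scale (all names and data are hypothetical):

```python
# Hypothetical digital records: (student, outcome_id, rubric_score 1-4).
records = [
    ("s1", "SLO-1", 4), ("s2", "SLO-1", 2), ("s3", "SLO-1", 3),
    ("s1", "PDO-1", 1), ("s2", "PDO-1", 1),  # e.g., degree audit produced
]

def percent_meeting(records, outcome_id, threshold=3):
    """Share of recorded scores at or above the threshold for an outcome."""
    scores = [score for _, oid, score in records if oid == outcome_id]
    if not scores:
        return 0.0
    return 100 * sum(score >= threshold for score in scores) / len(scores)

print(f"SLO-1 met by {percent_meeting(records, 'SLO-1'):.0f}% of students")
```

The point is not the arithmetic but the workflow: once evaluation happens in an LMS or e-Portfolio, results like these can be pulled and summarized per outcome rather than re-keyed by hand.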

Curriculum planning and the intentional use of technology help advisors evaluate essential learning, demonstrate how advising contributes to student learning, and support program assessment.

George E. Steele Retired, The Ohio State University [email protected]

Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York, NY: Pearson, Allyn & Bacon.

Bloom, B. S. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain. New York, NY: David McKay.

Campbell, S. M. (2008). Vision, mission, goals, and program objectives for academic advising programs. In V. N. Gordon, W. R. Habley, & T. J. Grites (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 229-241). San Francisco, CA: Jossey-Bass.

Hemwall, M. K., & Trachte, K. C. (1999). Learning at the core: Toward a new understanding of academic advising. NACADA Journal, 19(1), 5-11.

Hemwall, M. K., & Trachte, K. C. (2003). Academic advising and a learning paradigm. In M. K. Hemwall & K. C. Trachte (Eds.), Advising and learning: Academic advising from the perspective of small colleges and universities (NACADA Monograph No. 8, pp. 13-20). Manhattan, KS: National Academic Advising Association.

Lowenstein, M. (2005). If advising is teaching, what do advisors teach? NACADA Journal, 25(2), 65-73.

Martin, H. (2007). Constructing learning objectives for academic advising. Retrieved from NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Constructing-student-learning-outcomes.aspx

McTighe, J., & Wiggins, G. (1999). Understanding by design professional development workbook. Alexandria, VA: ASCD.

Robbins, R., & Zarges, K. M. (2011). Assessment of academic advising: A summary of the process. Retrieved from NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Assessment-of-academic-advising.aspx

Steele, G. (2014). Intentional use of technology for academic advising. Retrieved from NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Intentional-use-of-technology-for-academic-advising.aspx

Wiggins, G., & McTighe, J. (2005). Understanding by design (expanded 2nd ed.). Alexandria, VA: ASCD.

Wiggins, G., & McTighe, J. (2011). The understanding by design guide to creating high-quality units. Alexandria, VA: ASCD.

Cite this using APA style as:

Steele, G. (2015). Using Technology for Intentional Student Evaluation and Program Assessment. Retrieved -insert today's date- from NACADA Clearinghouse of Academic Advising Resources website: http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Using-Technology-for-Evaluation-and-Assessment.aspx


Indiana University Bloomington

Center for Innovative Teaching and Learning


Teaching Demonstrations

Teaching demonstrations are an integral part of the interview process for academic jobs in which teaching responsibilities are part of the workload. Even if you are not on the job market, teaching demonstrations can be a helpful exercise for analyzing your instructional practices and identifying areas for improvement. Many four-year institutions require a teaching demonstration in addition to a research/job talk, while two-year or teaching-oriented institutions may only require a teaching demonstration (Gannon, 2019). However, the structure, format, and length of teaching demonstrations differ between four-year and two-year institutions. Whether a demonstration is required can also depend on the discipline, though most institutions appear to require one: in a survey of 113 biology faculty from a variety of institutions across North America, 62% reported that their departments required a teaching demonstration (Smith et al., 2013). The rest of this article explains what a teaching demonstration is and how to prepare for one.

What are Teaching Demonstrations?


Classroom Audience

The audience for a teaching demonstration often includes undergraduate students from the institution, departmental faculty, and the hiring committee. Sometimes the department asks job candidates to play the role of a guest lecturer in an existing class, with the hiring committee and other faculty observing from the back. Other times, the department assembles a blend of students, faculty, and the hiring committee (or sometimes just the hiring committee itself) as your audience. Regardless, the department is looking for a sneak peek at what your future classroom might be like if you were hired.

Consider the teaching demonstration a chance to show your ability to get others to learn actively, which is a unique part of the interview process. During research/job talks, candidates share their research with the departmental faculty and the hiring committee, who are typically other experts in the field. As such, discipline-specific jargon and commonly used abbreviations are welcome during these talks. Teaching demonstrations, by contrast, are meant to show how well a candidate can teach those discipline-specific concepts and skills to students who are novice-level thinkers in the discipline. In their survey of 113 biology faculty, Smith et al. (2013) found that the ability to create a "presentation understandable to students" and to "pitch the talk at the correct level for the intended audience" are two of the top five attributes of a successful teaching demonstration. Prepare your teaching demonstration as if the whole classroom, faculty and hiring committee included, were a mix of undergraduate students who may be in the major.

Questions to consider: 

  • Who is in your audience? Is it a pre-existing class, just the hiring committee, or a self-selected mix of undergraduate students, a group of faculty, and the hiring committee?
  • How large is your audience?
  • Be mindful of where your lesson may reside in the course calendar. Is your audience of undergraduates realistically able to engage in the material if your lesson is in the first half or second half of the semester?
  • Which parts of your lesson are appropriately building rapport with your audience and sparking their interest in the material?

Research your Institution


Questions to consider:

  • How will your teaching demonstration be mindful and inclusive of a diverse student body?
  • Does the institution/department take pride in certain teaching styles? Does your demonstration fit?

Classroom Space

The physical space of the classroom can affect the learning dynamics of your audience, as well as which teaching strategies you choose to implement. For example, a large pillar in the middle of the room may disrupt your plans for free-flowing discussion. Similarly, a demonstration built around impressive instructional technology may fall flat if the classroom has limited Internet access or technological tools. As Gannon (2019) explains, the more you know about the room where you'll be demonstrating your pedagogy, the better. Here are some factors to consider about the classroom setup:

  • Do your plans to use instructional technology complement your teaching demo?
  • Can the audience rearrange seats, desks, and tables to suit your lesson?
  • Is there a chalkboard or whiteboard? How many?
  • How many projectors and projector screens are there?
  • If you have a presentation/video/digital media, bring it on a flash drive, just in case.

Forming your Lesson

Depending on the type of institution or on departmental preferences, job candidates may be asked either to design a lesson from scratch or to deliver a prescribed lesson. Two-year institutions tend to be more teaching-focused than research-focused (Smith et al., 2013). Hiring committees at two-year institutions may assign a prescribed lesson or instruct job candidates to cover a certain chapter of the textbook (Gannon, 2019). Four-year institutions can be both research- and teaching-heavy, and the instructions and contexts for teaching demonstrations vary more with the preferences of individual departments. In general, job candidates should follow instructions from the department. If told to deliver a lesson to an introductory class in your department, make sure the content and activities are appropriate for students at that level.

Typically, a teaching demonstration is judged not on the amount of content covered but on how well a thoughtful selection of content is taught (Jenkins, 2017). Helping the audience realize the importance and relevance of your topic may be more useful. The easiest way is to relate the class material to something your audience may be familiar with; however, be careful with references to popular culture, because your class will contain a diversity of individuals. Here are a few guiding questions as you form your lesson:

  • What guidance, context, or materials were given to shape the teaching demonstration?
  • Are the lessons, activities, and content thoughtfully selected to suit the level of students in your teaching demonstration?
  • When explaining your content, do you have examples or analogies that do not rely on exclusive popular culture? 

Practicing your Lesson


Gannon, K. (2019). How to succeed at a teaching demo. Chronicle Vitae. Retrieved from https://chroniclevitae.com/news/2161-how-to-succeed-at-a-teaching-demo

Jenkins, R. (2017). The teaching demo: Less power, more point. The Chronicle of Higher Education. Retrieved from https://www.chronicle.com/article/The-Teaching-Demo-Less-Power/241893

Smith, M. K., Wenderoth, M. P., & Tyler, M. (2013). The teaching demonstration: What faculty expect and how to prepare for this aspect of the job interview. CBE Life Sciences Education, 12(1), 12-18. http://doi.org/10.1187/cbe.12-09-0161
