How technology is reinventing education

Stanford Graduate School of Education Dean Dan Schwartz and other education scholars weigh in on what's next for some of the technology trends taking center stage in the classroom.

Image credit: Claire Scully

New advances in technology are upending education, from the recent debut of new artificial intelligence (AI) chatbots like ChatGPT to the growing accessibility of virtual-reality tools that expand the boundaries of the classroom. For educators, at the heart of it all is the hope that every learner gets an equal chance to develop the skills they need to succeed. But that promise is not without its pitfalls.

“Technology is a game-changer for education – it offers the prospect of universal access to high-quality learning experiences, and it creates fundamentally new ways of teaching,” said Dan Schwartz, dean of Stanford Graduate School of Education (GSE), who is also a professor of educational technology at the GSE and faculty director of the Stanford Accelerator for Learning. “But there are a lot of ways we teach that aren’t great, and a big fear with AI in particular is that we just get more efficient at teaching badly. This is a moment to pay attention, to do things differently.”

For K-12 schools, this year also marks the end of the Elementary and Secondary School Emergency Relief (ESSER) funding program, which has provided pandemic recovery funds that many districts used to invest in educational software and systems. With these funds running out in September 2024, schools are trying to determine their best use of technology as they face the prospect of diminishing resources.

Here, Schwartz and other Stanford education scholars weigh in on some of the technology trends taking center stage in the classroom this year.

AI in the classroom

In 2023, the big story in technology and education was generative AI, following the introduction of ChatGPT and other chatbots that produce text seemingly written by a human in response to a question or prompt. Educators immediately worried that students would use the chatbot to cheat by trying to pass its writing off as their own. As schools move to adopt policies around students’ use of the tool, many are also beginning to explore potential opportunities – for example, to generate reading assignments or coach students during the writing process.

AI can also help automate tasks like grading and lesson planning, freeing teachers to do the human work that drew them into the profession in the first place, said Victor Lee, an associate professor at the GSE and faculty lead for the AI + Education initiative at the Stanford Accelerator for Learning. “I’m heartened to see some movement toward creating AI tools that make teachers’ lives better – not to replace them, but to give them the time to do the work that only teachers are able to do,” he said. “I hope to see more on that front.”

He also emphasized the need to start teaching students now to question and critique the development and use of AI. “AI is not going away,” said Lee, who is also director of CRAFT (Classroom-Ready Resources about AI for Teaching), which provides free resources to help teach AI literacy to high school students across subject areas. “We need to teach students how to understand and think critically about this technology.”

Immersive environments

The use of immersive technologies like augmented reality, virtual reality, and mixed reality is also expected to surge in the classroom, especially as new high-profile devices integrating these realities hit the marketplace in 2024.

The educational possibilities now go beyond putting on a headset and experiencing life in a distant location. With new technologies, students can create their own local interactive 360-degree scenarios, using just a cell phone or inexpensive camera and simple online tools.

“This is an area that’s really going to explode over the next couple of years,” said Kristen Pilner Blair, director of research for the Digital Learning initiative at the Stanford Accelerator for Learning, which runs a program exploring the use of virtual field trips to promote learning. “Students can learn about the effects of climate change, say, by virtually experiencing the impact on a particular environment. But they can also become creators, documenting and sharing immersive media that shows the effects where they live.”

Integrating AI into virtual simulations could also soon take the experience to another level, Schwartz said. “If your VR experience brings me to a redwood tree, you could have a window pop up that allows me to ask questions about the tree, and AI can deliver the answers.”

Gamification

Another trend expected to intensify this year is the gamification of learning activities, often featuring dynamic videos with interactive elements to engage and hold students’ attention.

“Gamification is a good motivator, because one key aspect is reward, which is very powerful,” said Schwartz. The downside? Rewards are specific to the activity at hand, which may not extend to learning more generally. “If I get rewarded for doing math in a space-age video game, it doesn’t mean I’m going to be motivated to do math anywhere else.”

Gamification sometimes tries to make “chocolate-covered broccoli,” Schwartz said, by adding art and rewards to make speeded response tasks involving single-answer, factual questions more fun. He hopes to see more creative play patterns that give students points for rethinking an approach or adapting their strategy, rather than only rewarding them for quickly producing a correct response.

Data-gathering and analysis

The growing use of technology in schools is producing massive amounts of data on students’ activities in the classroom and online. “We’re now able to capture moment-to-moment data, every keystroke a kid makes,” said Schwartz – data that can reveal areas of struggle and different learning opportunities, from solving a math problem to approaching a writing assignment.

But outside of research settings, he said, that type of granular data – now owned by tech companies – is more likely to be used to refine the design of the software than to provide teachers with actionable information.

The promise of personalized learning is being able to generate content aligned with students’ interests and skill levels, and to make lessons more accessible for multilingual learners and students with disabilities. Realizing that promise requires that educators be able to make sense of the data that’s being collected, said Schwartz – and while advances in AI are making it easier to identify patterns and findings, the data also needs to be in a system and form educators can access and analyze for decision-making. Developing a usable infrastructure for that data, Schwartz said, is an important next step.

With the accumulation of student data comes privacy concerns: How is the data being collected? Are there regulations or guidelines around its use in decision-making? What steps are being taken to prevent unauthorized access? In 2023, K-12 schools experienced a rise in cyberattacks, underscoring the need to implement strong systems to safeguard student data.

Technology is “requiring people to check their assumptions about education,” said Schwartz, noting that AI in particular is very efficient at replicating biases and automating the way things have been done in the past, including poor models of instruction. “But it’s also opening up new possibilities for students producing material, and for being able to identify children who are not average so we can customize toward them. It’s an opportunity to think of entirely new ways of teaching – this is the path I hope to see.”

Review Article | Open access | Published: 12 February 2024

Education reform and change driven by digital technology: a bibliometric study from a global perspective

  • Chengliang Wang 1,
  • Xiaojiao Chen 1,
  • Teng Yu (ORCID: orcid.org/0000-0001-5198-7261) 2,3,
  • Yidan Liu 1,4 &
  • Yuhui Jing 1

Humanities and Social Sciences Communications, volume 11, Article number: 256 (2024)


  • Development studies
  • Science, technology and society

Amidst the global digital transformation of educational institutions, digital technology has emerged as a significant area of interest among scholars. Such technologies have played an instrumental role in enhancing learner performance and improving the effectiveness of teaching and learning. These digital technologies also ensured the sustainability and stability of education during the COVID-19 pandemic. Despite this, a dearth of systematic reviews exists regarding the current state of digital technology application in education. To address this gap, this study utilized the Web of Science Core Collection as a data source (specifically selecting the high-quality SSCI and SCIE indexes) and implemented a topic search by setting keywords, yielding 1849 initial publications. Following the PRISMA guidelines, we refined the selection to 588 high-quality articles. Using software tools such as CiteSpace, VOSviewer, and Charticulator, we reviewed these 588 publications to identify core authors (such as Selwyn, Henderson, and Edwards), highly productive countries/regions (England, Australia, USA), key institutions (Monash University, Australian Catholic University), and crucial journals in the field (Education and Information Technologies, Computers & Education, British Journal of Educational Technology). Evolutionary analysis reveals four developmental periods in the research field of digital technology education application: the germination period, the initial development period, the critical exploration period, and the accelerated transformation period. The study highlights the dual influence of technological factors and historical context on the research topic: technology is a key factor enabling education to transform and upgrade, and the context of the times is an important driving force promoting the adoption of new technologies in the education system and the transformation and upgrading of education. Additionally, the study identifies three frontier hotspots in the field: physical education, digital transformation, and professional development under the promotion of digital technology. This study presents a clear framework for digital technology application in education, which can serve as a valuable reference for researchers and educational practitioners concerned with digital technology education application in theory and practice.


Introduction

Digital technology has become an essential component of modern education, facilitating the extension of temporal and spatial boundaries and enriching the pedagogical contexts (Selwyn and Facer, 2014 ). The advent of mobile communication technology has enabled learning through social media platforms (Szeto et al. 2015 ; Pires et al. 2022 ), while the advancement of augmented reality technology has disrupted traditional conceptions of learning environments and spaces (Perez-Sanagustin et al., 2014 ; Kyza and Georgiou, 2018 ). A wide range of digital technologies has enabled learning to become a norm in various settings, including the workplace (Sjöberg and Holmgren, 2021 ), home (Nazare et al. 2022 ), and online communities (Tang and Lam, 2014 ). Education is no longer limited to fixed locations and schedules, but has permeated all aspects of life, allowing learning to continue at any time and any place (Camilleri and Camilleri, 2016 ; Selwyn and Facer, 2014 ).

The advent of digital technology has led to the creation of several informal learning environments (Greenhow and Lewin, 2015 ) that exhibit divergent form, function, features, and patterns in comparison to conventional learning environments (Nygren et al. 2019 ). Consequently, the associated teaching and learning processes, as well as the strategies for the creation, dissemination, and acquisition of learning resources, have undergone a complete overhaul. The ensuing transformations have posed a myriad of novel issues, such as the optimal structuring of teaching methods by instructors and the adoption of appropriate learning strategies by students in the new digital technology environment. Consequently, an examination of the principles that underpin effective teaching and learning in this environment is a topic of significant interest to numerous scholars engaged in digital technology education research.

Over the course of the last two decades, digital technology has made significant strides in the field of education, notably in extending education time and space and creating novel educational contexts with sustainability. Despite research attempts to consolidate the application of digital technology in education, previous studies have only focused on specific aspects of digital technology, such as Pinto and Leite’s ( 2020 ) investigation into digital technology in higher education and Mustapha et al.’s ( 2021 ) examination of the role and value of digital technology in education during the pandemic. While these studies have provided valuable insights into the practical applications of digital technology in particular educational domains, they have not comprehensively explored the macro-mechanisms and internal logic of digital technology implementation in education. Additionally, these studies were conducted over a relatively brief period, making it challenging to gain a comprehensive understanding of the macro-dynamics and evolutionary process of digital technology in education. Some studies have provided an overview of digital education from an educational perspective but lack a precise understanding of technological advancement and change (Yang et al. 2022 ). Therefore, this study seeks to employ a systematic scientific approach to collate relevant research from 2000 to 2022, comprehend the internal logic and development trends of digital technology in education, and grasp the outstanding contribution of digital technology in promoting the sustainability of education in time and space. In summary, this study aims to address the following questions:

RQ1: Since the turn of the century, what has been the productivity distribution of the field of digital technology education application research at the author, country/region, institutional, and journal levels?

RQ2: What is the development trend of research on the application of digital technology in education in the past two decades?

RQ3: What are the current frontiers of research on the application of digital technology in education?

Literature review

Although the term “digital technology” has become ubiquitous, scholars have yet to agree on a unified definition, because the meaning of the term is closely tied to its specific context. Within the educational research domain, Selwyn’s (2016) definition is widely favored by scholars (Pinto and Leite, 2020). Selwyn (2016) provides a comprehensive view of various concrete digital technologies and their applications in education through ten specific cases, such as immediate feedback in classes, orchestrating teaching, and community learning. Through these specific application scenarios, Selwyn (2016) argues that digital technology encompasses technologies associated with digital devices, including but not limited to tablets, smartphones, computers, and social media platforms (such as Facebook and YouTube). Further, the practice of accessing the internet anywhere through portable devices can be taken as an extension of the application of digital technology.

The evolving nature of digital technology has significant implications in the field of education. In the 1990s, the focus of digital technology in education was on comprehending the nuances of digital space, digital culture, and educational methodologies, with its connotations aligned more towards the idea of e-learning. The advent and subsequent widespread usage of mobile devices since the dawn of the new millennium have been instrumental in the rapid expansion of the concept of digital technology. Notably, mobile learning devices such as smartphones and tablets, along with social media platforms, have become integral components of digital technology (Conole and Alevizou, 2010; Batista et al. 2016). In recent times, the burgeoning application of AI technology in the education sector has played a vital role in enriching the digital technology lexicon (Banerjee et al. 2021). ChatGPT, for instance, is identified as a novel educational technology that has immense potential to revolutionize future education (Rospigliosi, 2023; Arif, Munaf and Ul-Haque, 2023).

Pinto and Leite ( 2020 ) conducted a comprehensive macroscopic survey of the use of digital technologies in the education sector and identified three distinct categories, namely technologies for assessment and feedback, mobile technologies, and Information Communication Technologies (ICT). This classification criterion is both macroscopic and highly condensed. In light of the established concept definitions of digital technology in the educational research literature, this study has adopted the characterizations of digital technology proposed by Selwyn ( 2016 ) and Pinto and Leite ( 2020 ) as crucial criteria for analysis and research inclusion. Specifically, this criterion encompasses several distinct types of digital technologies, including Information and Communication Technologies (ICT), Mobile tools, eXtended Reality (XR) Technologies, Assessment and Feedback systems, Learning Management Systems (LMS), Publish and Share tools, Collaborative systems, Social media, Interpersonal Communication tools, and Content Aggregation tools.

Methodology and materials

Research method: bibliometric.

Quantitative approaches to studying publications had long been present in various aspects of scholarly work, yet they lacked systematic theoretical guidance and remained disorganized. In 1969, British scholar Pritchard (1969) proposed “bibliometrics,” which subsequently emerged as an independent discipline in scientific quantification research. Initially, Pritchard defined bibliometrics as “the application of mathematical and statistical methods to books and other media of communication”; however, the definition was not entirely rigorous. To remedy this, Hawkins (2001) expanded Pritchard’s definition to “the quantitative analysis of the bibliographic features of a body of literature.” De Bellis further clarified the objectives of bibliometrics, stating that it aims to analyze and identify patterns in literature, such as the most productive authors, institutions, countries, and journals in scientific disciplines, trends in literary production over time, and collaboration networks (De Bellis, 2009). According to Garfield (2006), bibliometric research enables the examination of the history and structure of a field, the flow of information within the field, the impact of journals, and the citation status of publications over a longer time scale. All of these definitions illustrate the unique role of bibliometrics as a research method for evaluating specific research fields.

This study uses CiteSpace, VOSviewer, and Charticulator to analyze data and create visualizations. Each of these three tools has its own strengths, and they complement each other. CiteSpace and VOSviewer draw on set theory and probability theory to provide various visualization views, such as keyword co-occurrence and co-authorship networks. They are easy to use and produce visually appealing graphics (Chen, 2006; van Eck and Waltman, 2009), and they are currently the two most widely used bibliometric visualization tools (Pan et al. 2018). In this study, VOSviewer provided the data necessary for the performance analysis; Charticulator was then used to redraw the tabular data exported from VOSviewer (to create the chord diagram of country collaboration), complementing the mapping process; CiteSpace was primarily utilized to generate keyword maps and conduct burst word analysis.

Data retrieval

This study selected documents from the Science Citation Index Expanded (SCIE) and Social Science Citation Index (SSCI) in the Web of Science Core Collection as the data source, for the following reasons:

(1) The Web of Science Core Collection, as a high-quality digital literature resource database, has been widely accepted by many researchers and is currently considered the most suitable database for bibliometric analysis (Jing et al. 2023a ). Compared to other databases, Web of Science provides more comprehensive data information (Chen et al. 2022a ), and also provides data formats suitable for analysis using VOSviewer and CiteSpace (Gaviria-Marin et al. 2019 ).

(2) The application of digital technology in the field of education is an interdisciplinary research topic, involving technical knowledge literature belonging to the natural sciences and education-related literature belonging to the social sciences. Therefore, it is necessary to select Science Citation Index Expanded (SCIE) and Social Science Citation Index (SSCI) as the sources of research data, ensuring the comprehensiveness of data while ensuring the reliability and persuasiveness of bibliometric research (Hwang and Tsai, 2011 ; Wang et al. 2022 ).

After establishing the source of research data, it is necessary to determine a retrieval strategy (Jing et al. 2023b). The choice of a retrieval strategy should balance the breadth and precision of the search formula; that is, it should encompass all the literature pertaining to the research topic while excluding irrelevant documents as much as possible. In light of this, the study set a retrieval strategy informed by multiple related papers (Mustapha et al. 2021; Luo et al. 2021). The research by Mustapha et al. (2021) guided us in selecting keywords (“digital” AND “technolog*”) to target digital technology, while Luo et al. (2021) informed the selection of terms (such as “instruct*,” “teach*,” and “education”) to establish links with the field of education. Then, based on the current application of digital technology in the educational domain and the scope of the selection criteria, we constructed the final retrieval strategy. Following the general patterns of past research (Jing et al. 2023a, 2023b), we conducted a specific screening using the topic search (Topics, TS) function in Web of Science. For the specific criteria used in the screening for this study, please refer to Table 1.
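To make the construction of the search formula concrete, the sketch below assembles an illustrative Web of Science topic (TS) query from the two keyword groups described above. This is a minimal sketch only: the digital-technology terms follow Mustapha et al. (2021), the education terms follow Luo et al. (2021), and the exact formula actually used in this study is the one specified in Table 1.

```python
# Illustrative reconstruction of a Web of Science topic (TS) search string.
# The exact retrieval formula used in the study is given in Table 1; the
# term lists below are assumptions based on the cited sources.
digital_terms = ['"digital" AND "technolog*"']                # Mustapha et al. (2021)
education_terms = ['"instruct*"', '"teach*"', '"education"']  # Luo et al. (2021)

ts_query = "TS=(({digital}) AND ({education}))".format(
    digital=" OR ".join(digital_terms),
    education=" OR ".join(education_terms),
)
print(ts_query)
# TS=(("digital" AND "technolog*") AND ("instruct*" OR "teach*" OR "education"))
```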

Literature screening

Literature acquired through keyword searches may contain ostensibly related yet actually unrelated works. Therefore, to ensure the close relevance of literature included in the analysis to the research topic, it is often necessary to perform a manual screening process to identify the final literature to be analyzed, subsequent to completing the initial literature search.

The manual screening process consists of two steps. Initially, irrelevant literature is weeded out based on the title and abstract, with two members of the research team involved in this phase. This stage lasted about one week, resulting in 1106 articles being retained. Subsequently, a comprehensive review of the full text is conducted to accurately identify the literature required for the study. To carry out the second phase of manual screening effectively and scientifically, and to minimize the potential for researcher bias, the research team established the inclusion criteria presented in Table 2 . Three members were engaged in this phase, which took approximately 2 weeks, culminating in the retention of 588 articles after meticulous screening. The entire screening process is depicted in Fig. 1 , adhering to the PRISMA guidelines (Page et al. 2021 ).

figure 1

The process of obtaining and filtering the necessary literature data for research.

Data standardization

Nguyen and Hallinger ( 2020 ) pointed out that raw data extracted from scientific databases often contains multiple expressions of the same term, and not addressing these synonymous expressions could affect research results in bibliometric analysis. For instance, in the original data, the author list may include “Tsai, C. C.” and “Tsai, C.-C.”, while the keyword list may include “professional-development” and “professional development,” which often require merging. Therefore, before analyzing the selected literature, a data disambiguation process is necessary to standardize the data (Strotmann and Zhao, 2012 ; Van Eck and Waltman, 2019 ). This study adopted the data standardization process proposed by Taskin and Al ( 2019 ), mainly including the following standardization operations:

Firstly, the author and source fields in the data are corrected and standardized to differentiate authors with similar names.

Secondly, the study checks whether the journals in which the literature was published have been renamed over the past 20 years, so as to avoid the influence of journal name changes on the analysis results.

Finally, the keyword field is standardized by unifying parts of speech and singular/plural forms of keywords, which can help eliminate redundant entries in the knowledge graph.
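As a minimal sketch of what these three operations might look like in code (the alias tables below are illustrative assumptions, not the actual thesaurus files built for VOSviewer and CiteSpace in this study):

```python
import re

# Illustrative alias tables; the real merge lists would be built from the data.
AUTHOR_ALIASES = {"tsai, c.-c.": "Tsai, C. C."}
KEYWORD_ALIASES = {"professional-development": "professional development"}

def standardize_author(name: str) -> str:
    # Map variant spellings of the same author to one canonical form.
    return AUTHOR_ALIASES.get(name.strip().lower(), name.strip())

def standardize_keyword(kw: str) -> str:
    # Lowercase, merge known synonyms, collapse whitespace, and apply a
    # crude singular/plural unification (imperfect, e.g. for "analysis").
    kw = kw.strip().lower()
    kw = KEYWORD_ALIASES.get(kw, kw)
    kw = re.sub(r"\s+", " ", kw)
    if kw.endswith("ies"):
        kw = kw[:-3] + "y"
    elif kw.endswith("s") and not kw.endswith("ss"):
        kw = kw[:-1]
    return kw

print(standardize_author("Tsai, C.-C."))                # Tsai, C. C.
print(standardize_keyword("professional-development"))  # professional development
print(standardize_keyword("Digital technologies"))      # digital technology
```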

Performance analysis (RQ1)

This section offers a thorough and detailed analysis of the state of research in the field of digital technology education. By utilizing descriptive statistics and visual maps, it provides a comprehensive overview of the development trends, authors, countries, institutions, and journal distribution within the field. The insights presented in this section are of great significance in advancing our understanding of the current state of research in this field and identifying areas for further investigation. The use of visual aids to display inter-country cooperation and the evolution of the field adds to the clarity and coherence of the analysis.

Time trend of the publications

To understand a research field, it is first necessary to understand the most basic quantitative information, among which the change in the number of publications per year best reflects the development trend of a research field. Figure 2 shows the distribution of publication dates.

figure 2

Time trend of the publications on application of digital technology in education.

From Fig. 2, it can be seen that the development of this field over the past 20 years can be roughly divided into three stages. The first stage was from 2000 to 2007, during which the number of publications was relatively low. Due to various factors such as technological maturity, the academic community did not pay widespread attention to the role of digital technology in expanding the scope of teaching and learning. The second stage was from 2008 to 2019, during which the overall number of publications showed an upward trend, and the development of the field entered an accelerated period, attracting more and more scholars’ attention. The third stage was from 2020 to 2022, during which the number of publications stabilized at around 100. During this period, the impact of the pandemic led a large number of scholars to focus on the role of digital technology in education during the pandemic, and research on the application of digital technology in education became a core topic in social science research.

Analysis of authors

An analysis of authors’ publication volumes provides information about the representative scholars and core research strengths of a research area. Table 3 presents information on the core authors in digital technology education application research, including name, publication number, and average number of citations per article (based on the analysis and statistics from VOSviewer).

Variations in research foci among scholars abound. Within the field of digital technology education application research over the past two decades, Neil Selwyn stands as the most productive author, having published 15 papers garnering a total of 1027 citations, an average of 68.47 citations per paper. As a Professor in the Faculty of Education at Monash University, Selwyn concentrates on exploring the application of digital technology in higher education contexts (Selwyn et al. 2021), as well as related products in higher education such as the Coursera, edX, and Udacity MOOC platforms (Bulfin et al. 2014). Selwyn’s contributions from the educational sociology perspective include extensive research on the impact of digital technology on education, highlighting the spatiotemporal extension of educational processes and practices through technological means as the greatest value of educational technology (Selwyn, 2012; Selwyn and Facer, 2014). In addition, he provides a blueprint for the development of future schools in 2030 based on the present impact of digital technology on education (Selwyn et al. 2019). The second most productive author in this field, Henderson, also offers significant contributions to the understanding of the value of digital technology in education, specifically in the higher education setting, with a focus on the impact of the pandemic (Henderson et al. 2015; Cohen et al. 2022). In contrast, Edwards’ research interests focus on early childhood education, particularly the application of digital technology in this context (Edwards, 2013; Bird and Edwards, 2015). Additionally, on the technical level, Edwards mainly prefers digital game technology, as it is a form of digital technology that children find relatively easy to accept (Edwards, 2015).

Analysis of countries/regions and organization

The present study aimed to ascertain the leading countries in digital technology education application research by analyzing 75 countries related to the 588 works of literature. Table 4 depicts the top ten countries that have contributed significantly to this field in terms of publication count (based on the analysis and statistics from VOSviewer). Our analysis of Table 4 data shows that England emerged as the most influential country/region, with 92 published papers and 2401 citations. Australia and the United States secured the second and third ranks, respectively, with 90 papers (2187 citations) and 70 papers (1331 citations) published. Geographically, most of the countries featured in the top ten publication volumes are situated in Oceania, North America, and Europe, with China being the only exception. Notably, all these countries, except China, belong to the group of developed nations, suggesting that economic strength is a prerequisite for fostering research in the digital technology education application field.

This study presents a visual representation of the publication output and cooperation relationships among different countries in the field of digital technology education application research. Specifically, a chord diagram is employed to display the top 30 countries in terms of publication output, as depicted in Fig. 3 . The chord diagram is composed of nodes and chords, where the nodes are positioned as scattered points along the circumference, and the length of each node corresponds to the publication output, with longer lengths indicating higher publication output. The chords, on the other hand, represent the cooperation relationships between any two countries, and are weighted based on the degree of closeness of the cooperation, with wider chords indicating closer cooperation. Through the analysis of the cooperation relationships, the findings suggest that the main publishing countries in this field are engaged in cooperative relationships with each other, indicating a relatively high level of international academic exchange and research internationalization.

figure 3

In the diagram, nodes are scattered along the circumference of a circle, with the length of each node representing the volume of publications. The weighted arcs connecting any two points on the circle are known as chords, representing the collaborative relationship between the two, with the width of the arc indicating the closeness of the collaboration.

Further analyzing Fig. 3 , we can extract more valuable information, enabling a deeper understanding of the connections between countries in the research field of digital technology in educational applications. It is evident that certain countries, such as the United States, China, and England, display thicker connections, indicating robust collaborative relationships in terms of productivity. These thicker lines signify substantial mutual contributions and shared objectives in certain sectors or fields, highlighting the interconnectedness and global integration in these areas. By delving deeper, we can also explore potential future collaboration opportunities through the chord diagram, identifying possible partners to propel research and development in this field. In essence, the chord diagram successfully encapsulates and conveys the multi-dimensionality of global productivity and cooperation, allowing for a comprehensive understanding of the intricate inter-country relationships and networks in a global context, providing valuable guidance and insights for future research and collaborations.
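The weights behind such a chord diagram are simply pairwise co-occurrence counts of countries across papers. Below is a minimal sketch, assuming each record carries a deduplicated list of its authors’ countries; the `papers` list is hypothetical stand-in data, not the study’s actual records.

```python
from collections import Counter
from itertools import combinations

# Hypothetical stand-in records; in the study, country lists per paper come
# from Web of Science address data as processed by VOSviewer.
papers = [
    {"countries": ["England", "Australia"]},
    {"countries": ["USA", "China"]},
    {"countries": ["England", "USA", "China"]},
]

collab = Counter()
for paper in papers:
    # One collaboration tie per unordered country pair per paper.
    for a, b in combinations(sorted(set(paper["countries"])), 2):
        collab[(a, b)] += 1

for (a, b), weight in collab.most_common():
    print(f"{a} -- {b}: {weight}")
# Rows of (country pair, weight) are the tabular input a tool such as
# Charticulator needs to draw the weighted chords.
```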

An in-depth examination of the publishing institutions is provided in Table 5 , showcasing the foremost 10 institutions ranked by their publication volume. Notably, Monash University and Australian Catholic University, situated in Australia, have recorded the most prolific publications within the digital technology education application realm, with 22 and 10 publications respectively. Moreover, the University of Oslo from Norway is featured among the top 10 publishing institutions, with an impressive average citation count of 64 per publication. It is worth highlighting that six institutions based in the United Kingdom were also ranked within the top 10 publishing institutions, signifying their leading position in this area of research.

Analysis of journals

Journals are the main carriers for publishing high-quality papers. Some scholars point out that the two key factors for measuring a journal’s influence in a specified field are the number of articles it publishes and the number of citations they receive: the more papers a journal publishes and the more citations they attract, the greater its influence (Dzikowski, 2018). Therefore, this study utilized VOSviewer to statistically analyze the top 10 journals with the most publications in the field of digital technology in education and calculated the average citations per article (see Table 6).
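As a sketch of the computation behind Table 6 (the `records` list here is hypothetical; in the study, the per-journal counts came from VOSviewer):

```python
from collections import defaultdict

# Hypothetical stand-in records for the 588 screened articles.
records = [
    {"journal": "Computers & Education", "citations": 120},
    {"journal": "Computers & Education", "citations": 80},
    {"journal": "Education and Information Technologies", "citations": 30},
]

paper_counts = defaultdict(int)
citation_counts = defaultdict(int)
for r in records:
    paper_counts[r["journal"]] += 1
    citation_counts[r["journal"]] += r["citations"]

# Rank journals by publication volume and report average citations per article.
for journal in sorted(paper_counts, key=paper_counts.get, reverse=True):
    avg = citation_counts[journal] / paper_counts[journal]
    print(f"{journal}: {paper_counts[journal]} papers, {avg:.2f} citations/article")
```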

Based on Table 6, it is apparent that the highest numbers of articles in the domain of digital technology in education research were published in Education and Information Technologies (47 articles), Computers & Education (34 articles), and the British Journal of Educational Technology (32 articles), indicating a higher article output compared to other journals. This underscores the fact that these three journals concentrate more on the application of digital technology in education. Furthermore, several other journals, such as Technology, Pedagogy and Education and Sustainability, have published more than 15 articles in this domain. Sustainability represents the open access movement, which has notably facilitated research progress in this field, indicating that the development of open access journals in recent years has had a significant impact. Although there is still considerable disagreement among scholars on the optimal approach to achieving open access, the notion that research outcomes should be accessible to all is widely recognized (Huang et al. 2020). Further analysis of the research fields to which these journals belong shows that, except for Sustainability, they all pertain to educational technology, thus providing a qualitative definition of the research area of digital technology education from the perspective of journals.

Temporal keyword analysis: thematic evolution (RQ2)

The evolution of research themes is a dynamic process, and previous studies have attempted to present the developmental trajectory of fields by drawing keyword networks in phases (Kumar et al. 2021 ; Chen et al. 2022b ). To understand the shifts in research topics across different periods, this study follows past research and, based on the significant changes in the research field and corresponding technological advancements during the outlined periods, divides the timeline into four stages (the first stage from January 2000 to December 2005, the second stage from January 2006 to December 2011, the third stage from January 2012 to December 2017; and the fourth stage from January 2018 to December 2022). The division into these four stages was determined through a combination of bibliometric analysis and literature review, which presented a clear trajectory of the field’s development. The research analyzes the keyword networks for each time period (as there are only three articles in the first stage, it was not possible to generate an appropriate keyword co-occurrence map, hence only the keyword co-occurrence maps from the second to the fourth stages are provided), to understand the evolutionary track of the digital technology education application research field over time.

2000.1–2005.12: germination period

From January 2000 to December 2005, digital technology education application research was in its infancy. Only three studies focused on digital technology, all of which were related to computers. Due to the popularity of computers, the home became a new learning environment, highlighting the important role of digital technology in expanding the scope of learning spaces (Sutherland et al. 2000 ). In specific disciplines and contexts, digital technology was first favored in medical clinical practice, becoming an important tool for supporting the learning of clinical knowledge and practice (Tegtmeyer et al. 2001 ; Durfee et al. 2003 ).

2006.1–2011.12: initial development period

The period from January 2006 to December 2011 was the initial development period of digital technology education research. Significant growth was observed in research related to digital technology, and discussions and theoretical analyses about “digital natives” emerged. During this phase, scholars focused on the debates about “how to use digital technology reasonably” and “whether current educational models and school curriculum design need to be adjusted on a large scale” (Bennett and Maton, 2010; Selwyn, 2009; Margaryan et al. 2011). These theoretical and speculative arguments provided a unique perspective for understanding the impact of digital technology on education and teaching. As can be seen from vocabulary such as “rethinking”, “disruptive pedagogy”, and “attitude” in Fig. 4, many scholars engaged in calm reflection and analysis amid the digital technology trend (Laurillard, 2008; Vratulis et al. 2011). During this phase, technology was still undergoing dramatic changes. The development of mobile technology had already caught the attention of many scholars (Wong et al. 2011), but digital technology represented by computers was still very active (Selwyn et al. 2011). The change in technological form would inevitably lead to educational transformation. Collins and Halverson (2010) summarized the prospects and challenges of using digital technology for learning and educational practices, believing that digital technology would bring a disruptive revolution to the education field and bring about a new educational system. In addition, the term “teacher education” in Fig. 4 reflects the impact of digital technology development on teachers. The rapid development of technology has widened the generation gap between teachers and students. To ensure smooth communication between teachers and students, teachers must keep up with the trend of technological development and establish a lifelong learning mindset (Donnison, 2009).

figure 4

In the diagram, each node represents a keyword, with the size of the node indicating the frequency of occurrence of the keyword. The connections represent the co-occurrence relationships between keywords, with a higher frequency of co-occurrence resulting in tighter connections.

2012.1–2017.12: critical exploration period

During the period spanning January 2012 to December 2017, the application of digital technology in education research underwent a significant exploration phase. As can be seen from Fig. 5, unlike in the previous stage, specific elements of particular digital technologies started to increase significantly, including the enrichment of technological contexts, a greater variety of research methods, and the diversification of learning modes. Moreover, the temporal and spatial dimensions of the learning environment were further de-emphasized, as noted in previous literature (Za et al. 2014). Given the rapidly accelerating pace of technological development, the education system in the digital era is in urgent need of collaborative evolution and reconstruction, as argued by Davis, Eickelmann, and Zaka (2013).

figure 5

In the domain of digital technology, social media has garnered substantial scholarly attention as a promising avenue for learning, as noted by Pasquini and Evangelopoulos ( 2016 ). The implementation of social media in education presents several benefits, including the liberation of education from the restrictions of physical distance and time, as well as the erasure of conventional educational boundaries. The user-generated content (UGC) model in social media has emerged as a crucial source for knowledge creation and distribution, with the widespread adoption of mobile devices. Moreover, social networks have become an integral component of ubiquitous learning environments (Hwang et al. 2013 ). The utilization of social media allows individuals to function as both knowledge producers and recipients, which leads to a blurring of the conventional roles of learners and teachers. On mobile platforms, the roles of learners and teachers are not fixed, but instead interchangeable.

In terms of research methodology, the prevalence of empirical studies with survey designs in the field of educational technology during this period is evident from vocabulary such as “achievement,” “acceptance,” “attitude,” and “ICT” in Fig. 5. These studies aim to understand learners’ willingness to adopt and attitudes towards new technologies, and some seek to investigate the impact of digital technologies on learning outcomes through quasi-experimental designs (Domínguez et al. 2013). Among these empirical studies, mobile learning emerged as a hot topic, and this is not surprising. First, the advantages of mobile learning environments over traditional ones have been empirically demonstrated (Hwang et al. 2013). Second, learners born around the turn of the century have been heavily influenced by digital technologies and have developed their own learning styles that are more open to mobile devices as a means of learning. Consequently, analyzing mobile learning as a relatively novel mode of learning has become an important issue for scholars in the field of educational technology.

The intervention of technology has led to the emergence of several novel learning modes, with the blended learning model being the most representative one in the current phase. Blended learning, a novel concept introduced in the information age, emphasizes the integration of the benefits of traditional learning methods and online learning. This learning mode not only highlights the prominent role of teachers in guiding, inspiring, and monitoring the learning process but also underlines the importance of learners’ initiative, enthusiasm, and creativity in the learning process. Despite being an early conceptualization, blended learning’s meaning has been expanded by the widespread use of mobile technology and social media in education. The implementation of new technologies, particularly mobile devices, has resulted in the transformation of curriculum design and increased flexibility and autonomy in students’ learning processes (Trujillo Maza et al. 2016), rekindling scholarly attention to this learning mode. However, some scholars have raised concerns about the potential drawbacks of the blended learning model, such as its significant impact on the traditional teaching system and the lack of systematic coping strategies and relevant policies in several schools and regions (Moskal et al. 2013).

2018.1–2022.12: accelerated transformation period

The period spanning from January 2018 to December 2022 witnessed a rapid transformation in the application of digital technology in education research. The field of digital technology education research reached a peak period of publication, largely influenced by factors such as the COVID-19 pandemic (Yu et al. 2023 ). Research during this period was built upon the achievements, attitudes, and social media of the previous phase, and included more elements that reflect the characteristics of this research field, such as digital literacy, digital competence, and professional development, as depicted in Fig. 6 . Alongside this, scholars’ expectations for the value of digital technology have expanded, and the pursuit of improving learning efficiency and performance is no longer the sole focus. Some research now aims to cultivate learners’ motivation and enhance their self-efficacy by applying digital technology in a reasonable manner, as demonstrated by recent studies (Beardsley et al. 2021 ; Creely et al. 2021 ).

figure 6

The COVID-19 pandemic has emerged as a crucial backdrop for the digital technology’s role in sustaining global education, as highlighted by recent scholarly research (Zhou et al. 2022 ; Pan and Zhang, 2020 ; Mo et al. 2022 ). The online learning environment, which is supported by digital technology, has become the primary battleground for global education (Yu, 2022 ). This social context has led to various studies being conducted, with some scholars positing that the pandemic has impacted the traditional teaching order while also expanding learning possibilities in terms of patterns and forms (Alabdulaziz, 2021 ). Furthermore, the pandemic has acted as a catalyst for teacher teaching and technological innovation, and this viewpoint has been empirically substantiated (Moorhouse and Wong, 2021 ). Additionally, some scholars believe that the pandemic’s push is a crucial driving force for the digital transformation of the education system, serving as an essential mechanism for overcoming the system’s inertia (Romero et al. 2021 ).

The rapid outbreak of the pandemic posed a challenge to the large-scale implementation of digital technologies, which was influenced by a complex interplay of subjective and objective factors. Objective constraints included the lack of infrastructure in some regions to support digital technologies, while subjective obstacles included psychological resistance among certain students and teachers (Moorhouse, 2021 ). These factors greatly impacted the progress of online learning during the pandemic. Additionally, Timotheou et al. ( 2023 ) conducted a comprehensive systematic review of existing research on digital technology use during the pandemic, highlighting the critical role played by various factors such as learners’ and teachers’ digital skills, teachers’ personal attributes and professional development, school leadership and management, and administration in facilitating the digitalization and transformation of schools.

The current stage of research is characterized by the pivotal term “digital literacy,” denoting a growing interest in learners’ attitudes toward and adoption of emerging technologies. Initially, the term “literacy” was restricted to fundamental abilities and knowledge associated with books and print materials (McMillan, 1996). However, with the swift advancement of computers and digital technology, there have been various attempts to broaden the scope of literacy beyond its traditional meaning, including game literacy (Buckingham and Burn, 2007), information literacy (Eisenberg, 2008), and media literacy (Turin and Friesem, 2020). Similarly, digital literacy has emerged as a crucial concept; Gilster (1997) was the first to introduce it, referring to the proficiency in utilizing technology and processing digital information in academic, professional, and daily life settings. In practical educational settings, learners who possess higher digital literacy often exhibit an aptitude for quickly mastering digital devices and applying them intelligently to education and teaching (Yu, 2022).

The utilization of digital technology in education has undergone significant changes over the past two decades, and has been a crucial driver of educational reform with each new technological revolution. The impact of these changes on the underlying logic of digital technology education applications has been noticeable. From computer technology to more recent developments such as virtual reality (VR), augmented reality (AR), and artificial intelligence (AI), the acceleration in digital technology development has been ongoing. Educational reforms spurred by digital technology development continue to be dynamic, as each new digital innovation presents new possibilities and models for teaching practice. This is especially relevant in the post-pandemic era, where the importance of technological progress in supporting teaching cannot be overstated (Mughal et al. 2022 ). Existing digital technologies have already greatly expanded the dimensions of education in both time and space, while future digital technologies aim to expand learners’ perceptions. Researchers have highlighted the potential of integrated technology and immersive technology in the development of the educational metaverse, which is highly anticipated to create a new dimension for the teaching and learning environment, foster a new value system for the discipline of educational technology, and more effectively and efficiently achieve the grand educational blueprint of the United Nations’ Sustainable Development Goals (Zhang et al. 2022 ; Li and Yu, 2023 ).

Hotspot evolution analysis (RQ3)

The examination of keyword evolution reveals a consistent trend in the advancement of digital technology education application research. The emergence and transformation of keywords serve as indicators of the varying research interests in this field. Thus, the utilization of the burst detection function available in CiteSpace allowed for the identification of the top 10 burst words that exhibited a high level of burst strength. This outcome is illustrated in Table 7 .
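CiteSpace’s burst detection is based on Kleinberg’s state-machine algorithm. The sketch below is a deliberately simplified frequency-ratio stand-in, not the actual algorithm, intended only to illustrate the underlying idea: flag the years in which a keyword’s share of all keyword occurrences rises well above its long-run average, and merge consecutive flagged years into burst intervals.

```python
def burst_intervals(counts, totals, threshold=2.0):
    """Simplified burst detection (illustrative only, not Kleinberg's algorithm).

    counts: {year: occurrences of one keyword}
    totals: {year: total keyword occurrences that year}
    Returns (start_year, end_year) intervals in which the keyword's relative
    frequency is at least `threshold` times its long-run average.
    """
    years = sorted(totals)
    overall = sum(counts.get(y, 0) for y in years) / sum(totals.values())
    flagged = {y for y in years
               if totals[y] and counts.get(y, 0) / totals[y] >= threshold * overall}
    intervals, start = [], None
    for y in years:
        if y in flagged and start is None:
            start = y
        elif y not in flagged and start is not None:
            intervals.append((start, y - 1))
            start = None
    if start is not None:
        intervals.append((start, years[-1]))
    return intervals

# Toy data: a keyword with steady low use that surges after 2018.
counts = {y: 1 for y in range(2008, 2018)}
counts.update({2018: 6, 2019: 9, 2020: 8, 2021: 10, 2022: 9})
totals = {y: 100 for y in range(2008, 2023)}
print(burst_intervals(counts, totals))  # [(2019, 2022)]
```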

According to the results presented in Table 7, burst terminology within the realm of digital technology education research is concentrated mainly between the years 2018 and 2022. Prior to this time frame, the emerging keywords were limited to “information technology” and “computer”. Notably, the keyword “computer” maintained a high burst intensity from 2008 to 2018, reflecting the central position of the computer within digital technology as the main carrier of many digital technologies, such as Learning Management Systems (LMS) and Assessment and Feedback systems (Barlovits et al. 2022).

Since 2018, an increasing number of research studies have focused on evaluating learners’ capabilities to accept, apply, and comprehend digital technologies. As indicated by the use of terms such as “digital literacy” and “digital skill,” the assessment of learners’ digital literacy has become a critical task. Scholarly efforts have been directed towards the development of literacy assessment tools and the implementation of empirical assessments. Furthermore, enhancing the digital literacy of both learners and educators has garnered significant attention (Nagle, 2018; Yu, 2022). Simultaneously, given the widespread use of various digital technologies in different formal and informal learning settings, promoting learners’ digital skills has become a crucial objective for contemporary schools (Nygren et al. 2019; Forde and OBrien, 2022).

Since 2020, the field of applied research on digital technology education has witnessed the emergence of three new hotspots, all of which have been affected to some extent by the pandemic. Firstly, digital technology has been widely applied in physical education, which is one of the subjects that has been severely affected by the pandemic (Parris et al. 2022 ; Jiang and Ning, 2022 ). Secondly, digital transformation has become an important measure for most schools, especially higher education institutions, to cope with the impact of the pandemic globally (García-Morales et al. 2021 ). Although the concept of digital transformation was proposed earlier, the COVID-19 pandemic has greatly accelerated this transformation process. Educational institutions must carefully redesign their educational products to face this new situation, providing timely digital learning methods, environments, tools, and support systems that have far-reaching impacts on modern society (Krishnamurthy, 2020 ; Salas-Pilco et al. 2022 ). Moreover, the professional development of teachers has become a key mission of educational institutions in the post-pandemic era. Teachers need to have a certain level of digital literacy and be familiar with the tools and online teaching resources used in online teaching, which has become a research hotspot today. Organizing digital skills training for teachers to cope with the application of emerging technologies in education is an important issue for teacher professional development and lifelong learning (Garzón-Artacho et al. 2021 ). As the main organizers and practitioners of emergency remote teaching (ERT) during the pandemic, teachers must put cognitive effort into their professional development to ensure effective implementation of ERT (Romero-Hall and Jaramillo Cherrez, 2022 ).

The burst word “digital transformation” reveals that we are in the midst of an ongoing digital technology revolution. With the emergence of innovative digital technologies such as ChatGPT and Microsoft 365 Copilot, technology trends will continue to evolve, albeit unpredictably. While the impact of these advancements on school education remains uncertain, it is anticipated that the widespread integration of technology will significantly affect the current education system. Rejecting emerging technologies without careful consideration is unwise. Like any revolution, the technological revolution in the education field has both positive and negative aspects. Detractors argue that digital technology disrupts learning and memory (Baron, 2021 ) or causes learners to become addicted and distracted from learning (Selwyn and Aagaard, 2020 ). On the other hand, the prudent use of digital technology in education offers a glimpse of a golden age of open learning. Educational leaders and practitioners have the opportunity to leverage cutting-edge digital technologies to address current educational challenges and develop a rational path for the sustainable and healthy growth of education.

Discussion on performance analysis (RQ1)

The field of digital technology education application research has experienced substantial growth since the turn of the century, a phenomenon that is quantifiably apparent through an analysis of authorship, country/region contributions, and institutional engagement. This expansion reflects the increased integration of digital technologies in educational settings and the heightened scholarly interest in understanding and optimizing their use.

Discussion on authorship productivity in digital technology education research

The authorship distribution within digital technology education research is indicative of the field’s intellectual structure and depth. A primary figure in this domain is Neil Selwyn, whose substantial citation rate underscores the profound impact of his work. His focus on the implications of digital technology in higher education and educational sociology has proven to be seminal. Selwyn’s research trajectory, especially the exploration of spatiotemporal extensions of education through technology, provides valuable insights into the multifaceted role of digital tools in learning processes (Selwyn et al. 2019).

Other notable contributors, like Henderson and Edwards, present diversified research interests, such as the impact of digital technologies during the pandemic and their application in early childhood education, respectively. Their varied focuses highlight the breadth of digital technology education research, encompassing pedagogical innovation, technological adaptation, and policy development.

Discussion on country/region-level productivity and collaboration

At the country/region level, the United Kingdom, specifically England, emerges as a leading contributor with 92 published papers and a significant citation count. This is closely followed by Australia and the United States, indicating a strong English-speaking research axis. Such geographical concentration of scholarly output often correlates with investment in research and development, technological infrastructure, and the prevalence of higher education institutions engaging in cutting-edge research.

China’s notable inclusion as the only non-Western country among the top contributors to the field suggests a growing research capacity and interest in digital technology in education. However, the lower average citation per paper for China could reflect emerging engagement or different research focuses that may not yet have achieved the same international recognition as Western counterparts.
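The “average citations per paper” comparison behind this observation is simple arithmetic and easy to reproduce from a bibliographic export. The sketch below is illustrative only: the file name and the “Country” and “TimesCited” column names are hypothetical placeholders, since export formats vary.

```python
# Minimal sketch: papers, total citations, and citations per paper by country.
# "wos_export.csv", "Country", and "TimesCited" are hypothetical placeholders;
# adapt them to the actual bibliographic export in use.
import pandas as pd

records = pd.read_csv("wos_export.csv")  # one row per indexed paper

by_country = records.groupby("Country").agg(
    papers=("Country", "size"),
    citations=("TimesCited", "sum"),
)
by_country["citations_per_paper"] = (
    by_country["citations"] / by_country["papers"]
).round(2)

print(by_country.sort_values("papers", ascending=False).head(10))
```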

The chord diagram analysis furthers this understanding, revealing dense interconnections between countries like the United States, China, and England, which indicates robust collaborations. Such collaborations are fundamental in addressing global educational challenges and shaping international research agendas.
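A chord diagram of this kind is rendered from a symmetric co-occurrence matrix in which each paper contributes one link for every pair of countries on its author list. Below is a minimal sketch of that counting step, assuming (hypothetically) that each record stores author countries as a semicolon-separated string:

```python
# Sketch: count country-pair collaboration links for a chord diagram.
# The semicolon-separated country strings are invented sample records.
from collections import Counter
from itertools import combinations

paper_countries = [
    "USA; China",
    "England; Australia; USA",
    "China; England",
]

links = Counter()
for entry in paper_countries:
    countries = sorted({c.strip() for c in entry.split(";")})
    for pair in combinations(countries, 2):
        links[pair] += 1  # one link per country pair per paper

for (a, b), weight in links.most_common():
    print(f"{a} -- {b}: {weight}")
```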

Discussion on institutional-level contributions to digital technology education

Institutional productivity in digital technology education research reveals a constellation of universities driving the field forward. Monash University and the Australian Catholic University have the highest publication output, signaling Australia’s significant role in advancing digital education research. The University of Oslo’s remarkable average citation count per publication indicates influential research contributions, potentially reflecting high-quality studies that resonate with the broader academic community.

The strong showing of UK institutions, including the University of London, The Open University, and the University of Cambridge, reinforces the UK’s prominence in this research field. Such institutions are often at the forefront of pedagogical innovation, benefiting from established research cultures and funding mechanisms that support sustained inquiry into digital education.

Discussion on journal publication analysis

An examination of journal outputs offers a lens into the communicative channels of the field’s knowledge base. Journals such as Education and Information Technologies, Computers & Education, and the British Journal of Educational Technology not only serve as the primary disseminators of research findings but also as indicators of research quality and relevance. The impact factor (IF) serves as a proxy for the quality and influence of these journals within the academic community.

The high citation counts for articles published in Computers & Education suggest that research disseminated through this medium has a wide-reaching impact and is of particular interest to the field. This is further evidenced by its significant IF of 11.182, indicating that the journal is a pivotal platform for seminal work in the application of digital technology in education.
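For reference, the two-year impact factor cited here is defined as the citations a journal receives in a given year to the items it published in the two preceding years, divided by the number of citable items it published in those years (Garfield, 2006):

```latex
\mathrm{IF}_{y} = \frac{C_{y}\left(\text{items published in } y-1 \text{ and } y-2\right)}{N_{\text{citable items}}\left(y-1,\; y-2\right)}
```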

The authorship, regional, and institutional productivity in the field of digital technology education application research collectively narrate the evolution of this domain since the turn of the century. The prominence of certain authors and countries underscores the importance of socioeconomic factors and existing academic infrastructure in fostering research productivity. Meanwhile, the centrality of specific journals as outlets for high-impact research emphasizes the role of academic publishing in shaping the research landscape.

As the field continues to grow, future research may benefit from leveraging the collaborative networks that have been elucidated through this analysis, perhaps focusing on underrepresented regions to broaden the scope and diversity of research. Furthermore, the stabilization of publication numbers in recent years invites a deeper exploration into potential plateaus in research trends or saturation in certain sub-fields, signaling an opportunity for novel inquiries and methodological innovations.

Discussion on the evolutionary trends (RQ2)

The evolution of the research field concerning the application of digital technology in education over the past two decades is a story of convergence, diversification, and transformation, shaped by rapid technological advancements and shifting educational paradigms.

At the turn of the century, the inception of digital technology in education was largely exploratory, with a focus on how emerging computer technologies could be harnessed to enhance traditional learning environments. Research from this early period was primarily descriptive, reflecting on the potential and challenges of incorporating digital tools into the educational setting. This phase was critical in establishing the fundamental discourse that would guide subsequent research, as it set the stage for understanding the scope and impact of digital technology in learning spaces (Wang et al. 2023).

As the first decade progressed, the narrative expanded to encompass the pedagogical implications of digital technologies. This was a period of conceptual debates, where terms like “digital natives” and “disruptive pedagogy” entered the academic lexicon, underscoring the growing acknowledgment of digital technology as a transformative force within education (Bennett and Maton, 2010). During this time, the research began to reflect a more nuanced understanding of the integration of technology, considering not only its potential to change where and how learning occurred but also its implications for educational equity and access.

In the second decade, with the maturation of internet connectivity and mobile technology, the focus of research shifted from theoretical speculations to empirical investigations. The proliferation of digital devices and the ubiquity of social media influenced how learners interacted with information and each other, prompting a surge in studies that sought to measure the impact of these tools on learning outcomes. The digital divide and issues related to digital literacy became central concerns, as scholars explored the varying capacities of students and educators to engage with technology effectively.

Throughout this period, there was an increasing emphasis on the individualization of learning experiences, facilitated by adaptive technologies that could cater to the unique needs and pacing of learners (Jing et al. 2023a). This individualization was coupled with a growing recognition of the importance of collaborative learning, both online and offline, and the role of digital tools in supporting these processes. Blended learning models, which combined face-to-face instruction with online resources, emerged as a significant trend, advocating for a balance between traditional pedagogies and innovative digital strategies.

The later years, particularly marked by the COVID-19 pandemic, accelerated the necessity for digital technology in education, transforming it from a supplementary tool to an essential platform for delivering education globally (Mo et al. 2022; Mustapha et al. 2021). This era brought about an unprecedented focus on online learning environments, distance education, and virtual classrooms. Research became more granular, examining not just the pedagogical effectiveness of digital tools, but also their role in maintaining continuity of education during crises, their impact on teacher and student well-being, and their implications for the future of educational policy and infrastructure.

Across these two decades, the research field has seen a shift from examining digital technology as an external addition to the educational process, to viewing it as an integral component of curriculum design, instructional strategies, and even assessment methods. The emergent themes have broadened from a narrow focus on specific tools or platforms to include wider considerations such as data privacy, ethical use of technology, and the environmental impact of digital tools.

Moreover, the field has moved from considering the application of digital technology in education as a primarily cognitive endeavor to recognizing its role in facilitating socio-emotional learning, digital citizenship, and global competencies. Researchers have increasingly turned their attention to the ways in which technology can support collaborative skills, cultural understanding, and ethical reasoning within diverse student populations.

In summary, the past two decades of research on digital technology applications in education have been characterized by a progression from foundational inquiries to complex analyses of digital integration. This evolution has mirrored the trajectory of technology itself, from a facilitative tool to a pervasive ecosystem defining contemporary educational experiences. Looking ahead, the field is poised to examine the implications of emerging technologies like AI, AR, and VR, and their potential to redefine the educational landscape even further. This ongoing metamorphosis suggests that the application of digital technology in education will remain a rich area of inquiry, demanding continual adaptation and forward thinking from educators and researchers alike.

Discussion on the study of research hotspots (RQ3)

The analysis of keyword evolution in digital technology education application research elucidates the current frontiers in the field, reflecting a trajectory that is in tandem with the rapidly advancing digital age. This landscape is sculpted by emergent technological innovations and shaped by the demands of an increasingly digital society.
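The keyword “bursts” referred to in this section come from CiteSpace, which implements Kleinberg’s burst-detection algorithm (Chen, 2006). As a rough intuition-builder only (not the algorithm CiteSpace actually uses), a keyword can be flagged when its recent yearly frequency jumps well above its historical baseline; the yearly counts below are invented sample data:

```python
# Illustrative proxy for keyword "bursts": flag terms whose recent yearly
# frequency far exceeds their historical mean. This is NOT the Kleinberg
# algorithm that CiteSpace implements, just a simple intuition-builder.
yearly_counts = {  # invented keyword frequencies, oldest year first
    "digital transformation": [1, 2, 1, 3, 2, 9, 14],
    "digital literacy": [2, 3, 4, 5, 6, 8, 9],
}

def is_bursting(counts, window=2, ratio=3.0):
    """Flag a burst when the mean of the last `window` years is at least
    `ratio` times the mean of all earlier years."""
    baseline = sum(counts[:-window]) / max(len(counts) - window, 1)
    recent = sum(counts[-window:]) / window
    return baseline > 0 and recent / baseline >= ratio

for term, counts in yearly_counts.items():
    print(term, "-> burst" if is_bursting(counts) else "-> steady")
```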

Interdisciplinary integration and pedagogical transformation

One of the frontiers identified from recent keyword bursts includes the integration of digital technology into diverse educational contexts, particularly noted with the keyword “physical education.” The digitalization of disciplines traditionally characterized by physical presence illustrates the pervasive reach of technology and signifies a push towards interdisciplinary integration where technology is not only a facilitator but also a transformative agent. This integration challenges educators to reconceptualize curriculum delivery to accommodate digital tools that can enhance or simulate the physical aspects of learning.

Digital literacy and skills acquisition

Another pivotal frontier is the focus on “digital literacy” and “digital skill”, which has intensified in recent years. This suggests a shift from mere access to technology towards a comprehensive understanding and utilization of digital tools. In this realm, the emphasis is not only on the ability to use technology but also on critical thinking, problem-solving, and the ethical use of digital resources (Yu, 2022). The acquisition of digital literacy is no longer an additive skill but a fundamental aspect of modern education, essential for navigating and contributing to the digital world.

Educational digital transformation

The keyword “digital transformation” marks a significant research frontier, emphasizing the systemic changes that education institutions must undergo to align with the digital era (Romero et al. 2021). This transformation includes the redesigning of learning environments, pedagogical strategies, and assessment methods to harness digital technology’s full potential. Research in this area explores the complexity of institutional change, addressing the infrastructural, cultural, and policy adjustments needed for a seamless digital transition.

Engagement and participation

Further exploration into “engagement” and “participation” underscores the importance of student-centered learning environments that are mediated by technology. The current frontiers examine how digital platforms can foster collaboration, inclusivity, and active learning, potentially leading to more meaningful and personalized educational experiences. Here, the use of technology seeks to support the emotional and cognitive aspects of learning, moving beyond the transactional view of education to one that is relational and interactive.

Professional development and teacher readiness

As the field evolves, “professional development” emerges as a crucial area, particularly in light of the pandemic which necessitated emergency remote teaching. The need for teacher readiness in a digital age is a pressing frontier, with research focusing on the competencies required for educators to effectively integrate technology into their teaching practices. This includes familiarity with digital tools, pedagogical innovation, and an ongoing commitment to personal and professional growth in the digital domain.

Pandemic as a catalyst

The recent pandemic has acted as a catalyst for accelerated research and application in this field, particularly in the domains of “digital transformation,” “professional development,” and “physical education.” This period has been a litmus test for the resilience and adaptability of educational systems to continue their operations in an emergency. Research has thus been directed at understanding how digital technologies can support not only continuity but also enhance the quality and reach of education in such contexts.

Ethical and societal considerations

The frontier of digital technology in education is also expanding to consider broader ethical and societal implications. This includes issues of digital equity, data privacy, and the sociocultural impact of technology on learning communities. The research explores how educational technology can be leveraged to address inequities and create more equitable learning opportunities for all students, regardless of their socioeconomic background.

Innovation and emerging technologies

Looking forward, the frontiers are set to be influenced by ongoing and future technological innovations, such as artificial intelligence (AI) (Wu and Yu, 2023; Chen et al. 2022a). The exploration into how these technologies can be integrated into educational practices to create immersive and adaptive learning experiences represents a bold new chapter for the field.

In conclusion, the current frontiers of research on the application of digital technology in education are multifaceted and dynamic. They reflect an overarching movement towards deeper integration of technology in educational systems and pedagogical practices, where the goals are not only to facilitate learning but to redefine it. As these frontiers continue to expand and evolve, they will shape the educational landscape, requiring a concerted effort from researchers, educators, policymakers, and technologists to navigate the challenges and harness the opportunities presented by the digital revolution in education.

Conclusions and future research

Conclusions

The utilization of digital technology in education is a research area that cuts across multiple technical and educational domains and continues to grow dynamically with the progress of technology. In this study, a systematic review of the field was conducted using bibliometric techniques to examine its development trajectory. The review focused on the leading contributors, productive countries/regions and institutions, significant publications, and the field’s evolving development patterns. The quantitative analysis yielded several key conclusions that shed light on the field’s current state and future prospects.

(1) The research field of digital technology education applications has entered a stage of rapid development, with publications peaking in recent years under the impact of the pandemic. Several key authors (e.g., Selwyn, Henderson, Edwards) and countries/regions (e.g., England, Australia, the USA) have made significant contributions to the field. International exchange has become frequent, and academic research in the field is highly internationalized. At the institutional level, higher education institutions in the UK and Australia are the field’s core productive forces.

(2) Education and Information Technologies, Computers & Education, and the British Journal of Educational Technology are notable journals that publish research related to digital technology education applications. These journals belong to the research field of educational technology and provide effective platforms for communicating research on digital technology education applications.

(3) Over the past two decades, research on digital technology education applications has progressed through stages of budding, initial development, and critical exploration to accelerated transformation, and it is currently approaching maturity. Technological progress and the changing times have been key driving forces for educational transformation and innovation, and both have played important roles in promoting the continuous development of education.

(4) Influenced by the pandemic, three frontiers have emerged in current research on digital technology education applications: physical education, digital transformation, and professional development driven by digital technology. These frontier hotspots reflect the core issues the education system faces when it encounters new technologies. Their evolution shows that new challenges arise as technology breaks through education’s original boundaries of time and space, and that education renews itself continuously by solving one hotspot problem after another.

The present study offers significant practical implications for scholars and practitioners in the field of digital technology education applications. Firstly, it presents a well-defined framework of the existing research in this area, serving as a comprehensive guide for new entrants to the field and shedding light on the developmental trajectory of this research domain. Secondly, the study identifies several contemporary research hotspots, thus offering a valuable decision-making resource for scholars aiming to explore potential research directions. Thirdly, the study undertakes an exhaustive analysis of published literature to identify core journals in the field of digital technology education applications, with Sustainability being identified as a promising open access journal that publishes extensively on this topic. This finding can potentially facilitate scholars in selecting appropriate journals for their research outputs.

Limitations and future research

This study also has some limitations arising from objective constraints. First, bibliometric analysis software imposes strict requirements on data quality. To ensure the quality and integrity of the collected data, the study included only journal papers indexed in SCIE and SSCI, the core collection of the Web of Science database, and excluded other databases, conference papers, editorials, and other publication types; this may overlook some research and original views in the field of digital technology education application research. In addition, although professional software was used for the bibliometric analysis and produced relatively objective quantitative data, the analysis and interpretation of those data inevitably carry a degree of subjectivity, and its influence cannot be completely avoided. Future research will therefore broaden the scope of literature screening and proactively engage scholars in the field to obtain objective, state-of-the-art insights while minimizing the adverse impact of personal subjectivity on the analysis.
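The screening rule described above (journal articles only, indexed in SCIE or SSCI) is straightforward to express over an exported record table. The sketch below is hypothetical: the file name, column names, and label values are placeholders rather than the actual Web of Science schema.

```python
# Sketch of the screening filter described above: keep only journal articles
# indexed in SCIE or SSCI, dropping conference papers, editorials, etc.
# "wos_raw_export.csv", "DocumentType", and "Index" are hypothetical names.
import pandas as pd

raw = pd.read_csv("wos_raw_export.csv")

screened = raw[
    raw["DocumentType"].eq("Article") & raw["Index"].isin(["SCIE", "SSCI"])
]
print(f"{len(screened)} of {len(raw)} records retained for analysis")
```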

Data availability

The datasets analyzed during the current study are available in the Dataverse repository: https://doi.org/10.7910/DVN/F9QMHY

Alabdulaziz MS (2021) COVID-19 and the use of digital technology in mathematics education. Educ Inf Technol 26(6):7609–7633. https://doi.org/10.1007/s10639-021-10602-3

Arif TB, Munaf U, Ul-Haque I (2023) The future of medical education and research: is ChatGPT a blessing or blight in disguise? Med Educ Online 28. https://doi.org/10.1080/10872981.2023.2181052

Banerjee M, Chiew D, Patel KT, Johns I, Chappell D, Linton N, Cole GD, Francis DP, Szram J, Ross J, Zaman S (2021) The impact of artificial intelligence on clinical education: perceptions of postgraduate trainee doctors in London (UK) and recommendations for trainers. BMC Med Educ 21. https://doi.org/10.1186/s12909-021-02870-x

Barlovits S, Caldeira A, Fesakis G, Jablonski S, Koutsomanoli Filippaki D, Lázaro C, Ludwig M, Mammana MF, Moura A, Oehler DXK, Recio T, Taranto E, Volika S (2022) Adaptive, synchronous, and mobile online education: developing the ASYMPTOTE learning environment. Mathematics 10:1628. https://doi.org/10.3390/math10101628


Baron NS (2021) Know what? How digital technologies undermine learning and remembering. J Pragmat 175:27–37. https://doi.org/10.1016/j.pragma.2021.01.011

Batista J, Morais NS, Ramos F (2016) Researching the use of communication technologies in higher education institutions in Portugal. https://doi.org/10.4018/978-1-5225-0571-6.ch057

Beardsley M, Albó L, Aragón P, Hernández-Leo D (2021) Emergency education effects on teacher abilities and motivation to use digital technologies. Br J Educ Technol 52. https://doi.org/10.1111/bjet.13101

Bennett S, Maton K (2010) Beyond the “digital natives” debate: towards a more nuanced understanding of students’ technology experiences. J Comput Assist Learn 26:321–331. https://doi.org/10.1111/j.1365-2729.2010.00360.x

Buckingham D, Burn A (2007) Game literacy in theory and practice 16:323–349


Bulfin S, Pangrazio L, Selwyn N (2014) Making “MOOCs”: the construction of a new digital higher education within news media discourse. In: The International Review of Research in Open and Distributed Learning 15. https://doi.org/10.19173/irrodl.v15i5.1856

Camilleri MA, Camilleri AC (2016) Digital learning resources and ubiquitous technologies in education. Technol Knowl Learn 22:65–82. https://doi.org/10.1007/s10758-016-9287-7

Chen C (2006) CiteSpace II: detecting and visualizing emerging trends and transient patterns in scientific literature. J Am Soc Inf Sci Technol 57:359–377. https://doi.org/10.1002/asi.20317

Chen J, Dai J, Zhu K, Xu L (2022a) Effects of extended reality on language learning: a meta-analysis. Front Psychol 13:1016519. https://doi.org/10.3389/fpsyg.2022.1016519


Chen J, Wang CL, Tang Y (2022b) Knowledge mapping of volunteer motivation: a bibliometric analysis and cross-cultural comparative study. Front Psychol 13. https://doi.org/10.3389/fpsyg.2022.883150

Cohen A, Soffer T, Henderson M (2022) Students’ use of technology and their perceptions of its usefulness in higher education: international comparison. J Comput Assist Learn 38(5):1321–1331. https://doi.org/10.1111/jcal.12678

Collins A, Halverson R (2010) The second educational revolution: rethinking education in the age of technology. J Comput Assist Learn 26:18–27. https://doi.org/10.1111/j.1365-2729.2009.00339.x

Conole G, Alevizou P (2010) A literature review of the use of Web 2.0 tools in higher education. Walton Hall, Milton Keynes, UK: the Open University, retrieved 17 February

Creely E, Henriksen D, Crawford R, Henderson M (2021) Exploring creative risk-taking and productive failure in classroom practice: a case study of the perceived self-efficacy and agency of teachers at one school. Think Ski Creat 42:100951. https://doi.org/10.1016/j.tsc.2021.100951

Davis N, Eickelmann B, Zaka P (2013) Restructuring of educational systems in the digital age from a co-evolutionary perspective. J Comput Assist Learn 29:438–450. https://doi.org/10.1111/jcal.12032

De Bellis N (2009) Bibliometrics and citation analysis: from the Science Citation Index to cybermetrics. Scarecrow Press

Domínguez A, Saenz-de-Navarrete J, de-Marcos L, Fernández-Sanz L, Pagés C, Martínez-Herráiz JJ (2013) Gamifying learning experiences: practical implications and outcomes. Comput Educ 63:380–392. https://doi.org/10.1016/j.compedu.2012.12.020

Donnison S (2009) Discourses in conflict: the relationship between Gen Y pre-service teachers, digital technologies and lifelong learning. Australasian J Educ Technol 25. https://doi.org/10.14742/ajet.1138

Durfee SM, Jain S, Shaffer K (2003) Incorporating electronic media into medical student education. Acad Radiol 10:205–210. https://doi.org/10.1016/s1076-6332(03)80046-6

Dzikowski P (2018) A bibliometric analysis of born global firms. J Bus Res 85:281–294. https://doi.org/10.1016/j.jbusres.2017.12.054

van Eck NJ, Waltman L (2009) Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics 84:523–538. https://doi.org/10.1007/s11192-009-0146-3

Edwards S (2013) Digital play in the early years: a contextual response to the problem of integrating technologies and play-based pedagogies in the early childhood curriculum. Eur Early Child Educ Res J 21:199–212. https://doi.org/10.1080/1350293x.2013.789190

Edwards S (2015) New concepts of play and the problem of technology, digital media and popular-culture integration with play-based learning in early childhood education. Technol Pedagogy Educ 25:513–532. https://doi.org/10.1080/1475939x.2015.1108929


Eisenberg MB (2008) Information literacy: essential skills for the information age. DESIDOC J Libr Inf Technol 28:39–47. https://doi.org/10.14429/djlit.28.2.166

Forde C, OBrien A (2022) A literature review of barriers and opportunities presented by digitally enhanced practical skill teaching and learning in health science education. Med Educ Online 27. https://doi.org/10.1080/10872981.2022.2068210

García-Morales VJ, Garrido-Moreno A, Martín-Rojas R (2021) The transformation of higher education after the COVID disruption: emerging challenges in an online learning scenario. Front Psychol 12. https://doi.org/10.3389/fpsyg.2021.616059

Garfield E (2006) The history and meaning of the journal impact factor. JAMA 295:90. https://doi.org/10.1001/jama.295.1.90


Garzón-Artacho E, Sola-Martínez T, Romero-Rodríguez JM, Gómez-García G (2021) Teachers’ perceptions of digital competence at the lifelong learning stage. Heliyon 7:e07513. https://doi.org/10.1016/j.heliyon.2021.e07513

Gaviria-Marin M, Merigó JM, Baier-Fuentes H (2019) Knowledge management: a global examination based on bibliometric analysis. Technol Forecast Soc Change 140:194–220. https://doi.org/10.1016/j.techfore.2018.07.006

Gilster P (1997) Digital literacy. Wiley Computer Pub, New York

Greenhow C, Lewin C (2015) Social media and education: reconceptualizing the boundaries of formal and informal learning. Learn Media Technol 41:6–30. https://doi.org/10.1080/17439884.2015.1064954

Hawkins DT (2001) Bibliometrics of electronic journals in information science. Inf Res 7(1). http://informationr.net/ir/7-1/paper120.html

Henderson M, Selwyn N, Finger G, Aston R (2015) Students’ everyday engagement with digital technology in university: exploring patterns of use and “usefulness”. J High Educ Policy Manag 37:308–319. https://doi.org/10.1080/1360080x.2015.1034424

Huang CK, Neylon C, Hosking R, Montgomery L, Wilson KS, Ozaygen A, Brookes-Kenworthy C (2020) Evaluating the impact of open access policies on research institutions. eLife 9. https://doi.org/10.7554/elife.57067

Hwang GJ, Tsai CC (2011) Research trends in mobile and ubiquitous learning: a review of publications in selected journals from 2001 to 2010. Br J Educ Technol 42:E65–E70. https://doi.org/10.1111/j.1467-8535.2011.01183.x

Hwang GJ, Wu PH, Zhuang YY, Huang YM (2013) Effects of the inquiry-based mobile learning model on the cognitive load and learning achievement of students. Interact Learn Environ 21:338–354. https://doi.org/10.1080/10494820.2011.575789

Jiang S, Ning CF (2022) Interactive communication in the process of physical education: are social media contributing to the improvement of physical training performance? Universal Access Inf Soc 1–10. https://doi.org/10.1007/s10209-022-00911-w

Jing Y, Zhao L, Zhu KK, Wang H, Wang CL, Xia Q (2023a) Research landscape of adaptive learning in education: a bibliometric study on research publications from 2000 to 2022. Sustainability 15:3115. https://doi.org/10.3390/su15043115

Jing Y, Wang CL, Chen Y, Wang H, Yu T, Shadiev R (2023b) Bibliometric mapping techniques in educational technology research: a systematic literature review. Educ Inf Technol 1–29. https://doi.org/10.1007/s10639-023-12178-6

Krishnamurthy S (2020) The future of business education: a commentary in the shadow of the Covid-19 pandemic. J Bus Res. https://doi.org/10.1016/j.jbusres.2020.05.034

Kumar S, Lim WM, Pandey N, Christopher Westland J (2021) 20 years of electronic commerce research. Electron Commer Res 21:1–40

Kyza EA, Georgiou Y (2018) Scaffolding augmented reality inquiry learning: the design and investigation of the TraceReaders location-based, augmented reality platform. Interact Learn Environ 27:211–225. https://doi.org/10.1080/10494820.2018.1458039

Laurillard D (2008) Technology enhanced learning as a tool for pedagogical innovation. J Philos Educ 42:521–533. https://doi.org/10.1111/j.1467-9752.2008.00658.x

Li M, Yu Z (2023) A systematic review on the metaverse-based blended English learning. Front Psychol 13. https://doi.org/10.3389/fpsyg.2022.1087508

Luo H, Li G, Feng Q, Yang Y, Zuo M (2021) Virtual reality in K-12 and higher education: a systematic review of the literature from 2000 to 2019. J Comput Assist Learn. https://doi.org/10.1111/jcal.12538

Margaryan A, Littlejohn A, Vojt G (2011) Are digital natives a myth or reality? University students’ use of digital technologies. Comput Educ 56:429–440. https://doi.org/10.1016/j.compedu.2010.09.004

McMillan S (1996) Literacy and computer literacy: definitions and comparisons. Comput Educ 27:161–170. https://doi.org/10.1016/s0360-1315(96)00026-7

Mo CY, Wang CL, Dai J, Jin P (2022) Video playback speed influence on learning effect from the perspective of personalized adaptive learning: a study based on cognitive load theory. Front Psychol 13. https://doi.org/10.3389/fpsyg.2022.839982

Moorhouse BL (2021) Beginning teaching during COVID-19: newly qualified Hong Kong teachers’ preparedness for online teaching. Educ Stud 1–17. https://doi.org/10.1080/03055698.2021.1964939

Moorhouse BL, Wong KM (2021) The COVID-19 Pandemic as a catalyst for teacher pedagogical and technological innovation and development: teachers’ perspectives. Asia Pac J Educ 1–16. https://doi.org/10.1080/02188791.2021.1988511

Moskal P, Dziuban C, Hartman J (2013) Blended learning: a dangerous idea? Internet High Educ 18:15–23

Mughal MY, Andleeb N, Khurram AFA, Ali MY, Aslam MS, Saleem MN (2022) Perceptions of teaching-learning force about Metaverse for education: a qualitative study. J. Positive School Psychol 6:1738–1745

Mustapha I, Thuy Van N, Shahverdi M, Qureshi MI, Khan N (2021) Effectiveness of digital technology in education during COVID-19 pandemic: a bibliometric analysis. Int J Interact Mob Technol 15:136

Nagle J (2018) Twitter, cyber-violence, and the need for a critical social media literacy in teacher education: a review of the literature. Teach Teach Educ 76:86–94

Nazare J, Woolf A, Sysoev I, Ballinger S, Saveski M, Walker M, Roy D (2022) Technology-assisted coaching can increase engagement with learning technology at home and caregivers’ awareness of it. Comput Educ 188:104565

Nguyen UP, Hallinger P (2020) Assessing the distinctive contributions of simulation & gaming to the literature, 1970–2019: a bibliometric review. Simul Gaming. https://doi.org/10.1177/1046878120941569

Nygren H, Nissinen K, Hämäläinen R, Wever B (2019) Lifelong learning: formal, non-formal and informal learning in the context of the use of problem-solving skills in technology-rich environments. Br J Educ Technol 50:1759–1770. https://doi.org/10.1111/bjet.12807

Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, Moher D (2021) The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. Int J Surg 88:105906

Pan SL, Zhang S (2020) From fighting COVID-19 pandemic to tackling sustainable development goals: an opportunity for responsible information systems research. Int J Inf Manage 55:102196. https://doi.org/10.1016/j.ijinfomgt.2020.102196

Pan X, Yan E, Cui M, Hua W (2018) Examining the usage, citation, and diffusion patterns of bibliometric mapping software: a comparative study of three tools. J Informetr 12:481–493. https://doi.org/10.1016/j.joi.2018.03.005

Parris Z, Cale L, Harris J, Casey A (2022) Physical activity for health, COVID-19 and social media: what, where and why? Movimento 28. https://doi.org/10.22456/1982-8918.122533

Pasquini LA, Evangelopoulos N (2016) Sociotechnical stewardship in higher education: a field study of social media policy documents. J Comput High Educ 29:218–239

Pérez-Sanagustín M, Hernández-Leo D, Santos P, Delgado Kloos C, Blat J (2014) Augmenting reality and formality of informal and non-formal settings to enhance blended learning. IEEE Trans Learn Technol 7:118–131. https://doi.org/10.1109/TLT.2014.2312719

Pinto M, Leite C (2020) Digital technologies in support of students learning in Higher Education: literature review. Digital Education Review 343–360. https://doi.org/10.1344/der.2020.37.343-360

Pires F, Masanet MJ, Tomasena JM, Scolari CA(2022) Learning with YouTube: beyond formal and informal through new actors, strategies and affordances Convergence 28(3):838–853. https://doi.org/10.1177/1354856521102054

Pritchard A (1969) Statistical bibliography or bibliometrics? J Doc 25:348–349

Romero M, Romeu T, Guitert M, Baztán P (2021) Digital transformation in higher education: the UOC case. In ICERI2021 Proceedings (pp. 6695–6703). IATED https://doi.org/10.21125/iceri.2021.1512

Romero-Hall E, Jaramillo Cherrez N (2022) Teaching in times of disruption: faculty digital literacy in higher education during the COVID-19 pandemic. Innovations in Education and Teaching International 1–11. https://doi.org/10.1080/14703297.2022.2030782

Rospigliosi PA (2023) Artificial intelligence in teaching and learning: what questions should we ask of ChatGPT? Interact Learn Environ 31:1–3. https://doi.org/10.1080/10494820.2023.2180191

Salas-Pilco SZ, Yang Y, Zhang Z (2022) Student engagement in online learning in Latin American higher education during the COVID-19 pandemic: a systematic review. Br J Educ Technol 53(3):593–619. https://doi.org/10.1111/bjet.13190

Selwyn N (2009) The digital native: myth and reality. Aslib Proc 61(4):364–379. https://doi.org/10.1108/00012530910973776

Selwyn N (2012) Making sense of young people, education and digital technology: the role of sociological theory. Oxf Rev Educ 38:81–96. https://doi.org/10.1080/03054985.2011.577949

Selwyn N, Facer K (2014) The sociology of education and digital technology: past, present and future. Oxf Rev Educ 40:482–496. https://doi.org/10.1080/03054985.2014.933005

Selwyn N, Banaji S, Hadjithoma-Garstka C, Clark W (2011) Providing a platform for parents? Exploring the nature of parental engagement with school learning platforms. J Comput Assist Learn 27:314–323. https://doi.org/10.1111/j.1365-2729.2011.00428.x

Selwyn N, Aagaard J (2020) Banning mobile phones from classrooms: an opportunity to advance understandings of technology addiction, distraction and cyberbullying. Br J Educ Technol 52. https://doi.org/10.1111/bjet.12943

Selwyn N, O’Neill C, Smith G, Andrejevic M, Gu X (2021) A necessary evil? The rise of online exam proctoring in Australian universities. Media Int Aust. https://doi.org/10.1177/1329878x211005862

Selwyn N, Pangrazio L, Nemorin S, Perrotta C (2019) What might the school of 2030 be like? An exercise in social science fiction. Learn Media Technol 1–17. https://doi.org/10.1080/17439884.2020.1694944

Selwyn N (2016) What works and why? Understanding successful technology-enabled learning within institutional contexts: final report appendices (Part B). Monash University & Griffith University

Sjöberg D, Holmgren R (2021) Informal workplace learning in Swedish police education: a teacher perspective. Vocat Learn. https://doi.org/10.1007/s12186-021-09267-3

Strotmann A, Zhao D (2012) Author name disambiguation: what difference does it make in author-based citation analysis? J Am Soc Inf Sci Technol 63:1820–1833


Sutherland R, Facer K, Furlong R, Furlong J (2000) A new environment for education? The computer in the home. Comput Educ 34:195–212. https://doi.org/10.1016/s0360-1315(99)00045-7

Szeto E, Cheng AY-N, Hong J-C (2015) Learning with social media: how do preservice teachers integrate YouTube and social media in teaching? Asia-Pac Educ Res 25:35–44. https://doi.org/10.1007/s40299-015-0230-9

Tang E, Lam C (2014) Building an effective online learning community (OLC) in blog-based teaching portfolios. Internet High Educ 20:79–85. https://doi.org/10.1016/j.iheduc.2012.12.002

Taskin Z, Al U (2019) Natural language processing applications in library and information science. Online Inf Rev 43:676–690. https://doi.org/10.1108/oir-07-2018-0217

Tegtmeyer K, Ibsen L, Goldstein B (2001) Computer-assisted learning in critical care: from ENIAC to HAL. Crit Care Med 29:N177–N182. https://doi.org/10.1097/00003246-200108001-00006


Timotheou S, Miliou O, Dimitriadis Y, Sobrino SV, Giannoutsou N, Cachia R, Moné AM, Ioannou A (2023) Impacts of digital technologies on education and factors influencing schools’ digital capacity and transformation: a literature review. Educ Inf Technol 28(6):6695–6726. https://doi.org/10.1007/s10639-022-11431-8

Trujillo Maza EM, Gómez Lozano MT, Cardozo Alarcón AC, Moreno Zuluaga L, Gamba Fadul M (2016) Blended learning supported by digital technology and competency-based medical education: a case study of the social medicine course at the Universidad de los Andes, Colombia. Int J Educ Technol High Educ 13. https://doi.org/10.1186/s41239-016-0027-9

Turin O, Friesem Y (2020) Is that media literacy? Israeli and US media scholars’ perceptions of the field. J Media Lit Educ 12:132–144

Van Eck NJ, Waltman L (2019) VOSviewer manual. Universiteit Leiden

Vratulis V, Clarke T, Hoban G, Erickson G (2011) Additive and disruptive pedagogies: the use of slowmation as an example of digital technology implementation. Teach Teach Educ 27:1179–1188. https://doi.org/10.1016/j.tate.2011.06.004

Wang CL, Dai J, Xu LJ (2022) Big data and data mining in education: a bibliometrics study from 2010 to 2022. In: 2022 7th International Conference on Cloud Computing and Big Data Analytics (ICCCBDA). IEEE, pp 507–512. https://doi.org/10.1109/icccbda55098.2022.9778874

Wang CL, Dai J, Zhu KK, Yu T, Gu XQ (2023) Understanding the continuance intention of college students toward new E-learning spaces based on an integrated model of the TAM and TTF. Int J Hum-Comput Int 1–14. https://doi.org/10.1080/10447318.2023.2291609

Wong L-H, Boticki I, Sun J, Looi C-K (2011) Improving the scaffolds of a mobile-assisted Chinese character forming game via a design-based research cycle. Comput Hum Behav 27:1783–1793. https://doi.org/10.1016/j.chb.2011.03.005

Wu R, Yu Z (2023) Do AI chatbots improve students learning outcomes? Evidence from a meta-analysis. Br J Educ Technol. https://doi.org/10.1111/bjet.13334

Yang D, Zhou J, Shi D, Pan Q, Wang D, Chen X, Liu J (2022) Research status, hotspots, and evolutionary trends of global digital education via knowledge graph analysis. Sustainability 14:15157. https://doi.org/10.3390/su142215157

Yu T, Dai J, Wang CL (2023) Adoption of blended learning: Chinese university students’ perspectives. Humanit Soc Sci Commun 10:390

Yu Z (2022) Sustaining student roles, digital literacy, learning achievements, and motivation in online learning environments during the COVID-19 pandemic. Sustainability 14:4388. https://doi.org/10.3390/su14084388

Za S, Spagnoletti P, North-Samardzic A (2014) Organisational learning as an emerging process: the generative role of digital tools in informal learning practices. Br J Educ Technol 45:1023–1035. https://doi.org/10.1111/bjet.12211

Zhang X, Chen Y, Hu L, Wang Y (2022) The metaverse in education: definition, framework, features, potential applications, challenges, and future research topics. Front Psychol 13:1016300. https://doi.org/10.3389/fpsyg.2022.1016300

Zhou M, Dzingirai C, Hove K, Chitata T, Mugandani R (2022) Adoption, use and enhancement of virtual learning during COVID-19. Education and Information Technologies. https://doi.org/10.1007/s10639-022-10985-x


Acknowledgements

This research was supported by the Zhejiang Provincial Social Science Planning Project, “Mechanisms and Pathways for Empowering Classroom Teaching through Learning Spaces under the Strategy of High-Quality Education Development”, the 2022 National Social Science Foundation Education Youth Project “Research on the Strategy of Creating Learning Space Value and Empowering Classroom Teaching under the background of ‘Double Reduction’” (Grant No. CCA220319) and the National College Student Innovation and Entrepreneurship Training Program of China (Grant No. 202310337023).

Author information

Authors and Affiliations

College of Educational Science and Technology, Zhejiang University of Technology, Zhejiang, China

Chengliang Wang, Xiaojiao Chen, Yidan Liu & Yuhui Jing

Graduate School of Business, Universiti Sains Malaysia, Minden, Malaysia

Department of Management, The Chinese University of Hong Kong, Hong Kong, China

College of Humanities and Social Sciences, Beihang University, Beijing, China


Contributions

Conceptualization: Y.J., C.W.; methodology: C.W.; software: C.W., Y.L.; writing, original draft preparation: C.W., Y.L.; writing, review and editing: T.Y., Y.L., C.W.; supervision: X.C., T.Y.; project administration: Y.J.; funding acquisition: X.C., Y.L. All authors read and approved the final manuscript and its re-submission.

Corresponding author

Correspondence to Yuhui Jing.

Ethics declarations

Ethical approval

Ethical approval was not required as the study did not involve human participants.

Informed consent

Informed consent was not required as the study did not involve human participants.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article

Cite this article

Wang, C., Chen, X., Yu, T. et al. Education reform and change driven by digital technology: a bibliometric study from a global perspective. Humanit Soc Sci Commun 11, 256 (2024). https://doi.org/10.1057/s41599-024-02717-y


Received: 11 July 2023

Accepted: 17 January 2024

Published: 12 February 2024

DOI: https://doi.org/10.1057/s41599-024-02717-y



The Pros and Cons of 7 Digital Teaching Tools


One highlight of the last 18 months has been the level of experimentation I’ve seen among educators. They’ve explored new ways to teach in different environments and new technologies to keep students involved and engaged. As we move forward, it’s important that we learn from all this experimentation so we may deliver a learning experience that’s better than what we entered the pandemic delivering.

Simply rejecting all digital modes of teaching once you’re back in a physical classroom is not in your students’—or your own—best interest. There are many benefits to virtual learning that are worth keeping. In my own teaching, I’ve tried to incorporate the best learnings of pandemic teaching by using a four-step framework —struggle, structure, systemize, and synthesize—alongside different digital teaching methods and technologies that I’ve found work for me.

The precise way you use each digital tool and the extent to which you combine digital instruction with in-person instruction will of course depend on the needs of each specific course you teach. But to help you start thinking about how digital tools can remain useful to you, here’s a summary of the advantages and disadvantages of seven of the most common ones. I also share when I use each one to help spur your thinking.

1. Recorded Lecture Videos

Recording yourself giving lectures is perhaps the simplest digital approach. While these video recordings are easy to create and effective for sharing information quickly, production value is often less than ideal and the videos can be less than engaging. Overall, this approach doesn’t nearly reach the full potential that can be accomplished with digital learning.

Advantages of recorded lectures:

They let students consume course material on their own schedule and at their own pace, which students like.

They are more accessible—you can speed them up or slow them down—and you can easily add additional accessibility features, such as automated closed captioning or transcriptions.

Students can fast forward through material they already understand and rewind or rewatch material they are struggling with, unlike in a live lecture when wandering attention can mean missing a crucial point.

Disadvantages of recorded lectures:

They can be less than engaging.

They’re not interactive.

When I use them:

Truthfully, I don’t use them very often. They can be useful for exceptional circumstances that make it impossible for everyone to be in a live lecture.

Occasionally, I use them to set up a mini-case to initiate problem-solving thinking, or to provide information about key framework ideas before class discussion, but I tend to use edited video lessons (see next section) for that purpose.

2. Edited Video Lessons

A step up from raw lecture recordings, edited video lessons are shorter, purpose-built videos that use editing, graphics, and other production elements to explain concepts more clearly. (I discuss how I blend these elements across a course in my HBP Education webinar, Designing Better Courses: Blending the Best of Pre- and Post-Pandemic Pedagogy.)

Advantages of video lessons:

All the advantages of recorded lectures (e.g., self-paced).

Students have an opportunity to watch several short videos in a row, as their schedules permit.

Graphics and other illustrations can be useful for clarifying concepts.

Disadvantages of video lessons:

They’re more engaging than recorded lectures, but still not interactive.

Producing these videos requires extra time and effort.

When I use them:

To set up initial problem situations or present useful framework materials.

To add new information that may cause students to reconsider previous conclusions.

To teach a mechanical analysis approach, such as how to calculate a net present value.

3. Zoom Sessions

When courses are held fully remotely or in a hybrid setting (with some students participating in person and some participating virtually), most class sessions and discussions happen over Zoom or a similar videoconferencing platform. These live virtual sessions can allow for a synchronous learning experience enhanced by other digital tools, such as whiteboards and other display technologies, but they cannot be considered an exact replacement for in-person discussions.

For those teaching fully in person, Zoom can still be used for things like bringing in guests from afar and for exercises that involve the use of groups in the form of breakout rooms. I run a negotiation exercise for one of my classes that is actually a lot easier to run in Zoom than in person, because it involves rapid transitions between breakout groups and larger class discussion. Zoom is also great for students to use in coordinating project work outside of class.

Advantages of Zoom sessions:

Students can synchronously interact with each other remotely.

Technology allows for unique modes of interaction and discussion, such as breakout rooms , which can be configured instantaneously, as well as chat channels.

It’s easy to invite remote guest speakers who would otherwise be unable to travel to campus.

Disadvantages of Zoom sessions:

Students and educators alike can experience Zoom fatigue.

It can be hard to read interpersonal cues from those who are remote.

While Zoom calls are interactive, they still lack valuable opportunities for casual social interaction.

There’s no real substitute for students walking in the hall together, chatting about pretty much anything. At least not yet.

When I use them:

For case discussions that include remote guests.

For exercises that need fast transitions in and out of groups.

For group-based project work.

4. Online Discussion Boards

Many instructors have tried to replace in-person discussions with asynchronous online discussion boards. In my experience, however, online discussion boards are best used in conjunction with synchronous discussion (via Zoom or in person). You can pick up points or concepts introduced in an online discussion and use them as jumping-off points for a synchronous discussion—giving credit to the students who raised them, of course. It’s a flow that I find leads to greater understanding of the material.

Advantages of online discussion boards:

They encourage student interaction.

Students can participate on their own time.

There’s generally no limit to the number of ideas students can contribute—meaning more students can participate in these discussions.

Shy students reluctant to engage in live sessions can build confidence with online contributions, especially if you pick up their points and credit them in synchronous discussions.

Disadvantages of online discussion boards:

Although instructors can drop comments and questions into online chat, it’s harder to actively guide and focus the discussions (because you’re not constantly there), so there’s no guarantee that students will arrive at the desired conclusions.

Multiple unrelated, branching discussions can occur at once, making things confusing or unfocused.

Students may not enjoy these types of discussions; they can feel forced or unnatural.

When I use them:

To start students thinking in a particular direction with the intention of bringing it all home in synchronous discussions.

To allow shy students opportunities to make contributions and gain confidence that may carry over into live sessions.

To surface ideas that I want to pick up on and add to in subsequent synchronous discussions.

5. Simulations

Simulations, like case studies, are a way to immerse students in a very specific experience—but with simulations, information is unfolding in real time. We can then ask students to do the work of extracting generalizable propositions, frameworks, theories, and so forth under our guidance.

Advantages of simulations:

They invite students to interact directly with the course material—and often each other—to solve the types of problems they may encounter in a real business environment.

Students have the opportunity to take direct control of their learning . They reach their own conclusions, then connect those learnings to framework material you present to rescue them from their struggle with it—to help them structure and systemize.

They have narrative elements and cause students to change their minds; students tend to remember lessons from simulations in much the same way they remember an impactful dramatic experience.

They give students experience in organizing and making meaning from information that arrives in real time and out of any helpful order.

Disadvantages of simulations:

They can take up a lot of time; in my view, the real learning from a simulation happens in a debrief and you need to take the time to distill out general lessons , especially when the models that underlie a simulation are complex.

Preparing a simulation for use can be effort intensive for instructors.

When I use them:

Very much in the same situations I use cases—when I want to present specific problems or situations from which I want students to derive general lessons.

To mix learning modes, as a break from and enhancement of cases.

Sometimes, in conjunction with cases, to show students that it can be harder than they think to “walk the talk”—to do what they said they would do in a case discussion when confronted with a problem unfolding in real time.

6. Multimedia Content

There’s also a lot of great multimedia content available—and this is yet another way to mix things up and shift modes to keep students interested. Using video elements in multimedia cases, for example, allows students to see and hear case protagonists as opposed to just reading quotations.

Advantages of multimedia content:

Multimedia experiences offer a change of pace, and they’re often highly engaging.

Disadvantages of multimedia content:

They still don’t facilitate casual social interaction.

When I use it:

When I want to offer alternative modes for introducing problems or management situations, much like my use cases for simulations.

7. Curated Content

Many of us were using curated third-party content—anything from TED Talks to podcasts to YouTube tutorial videos—before the pandemic. But going virtual has prompted me to search around and use even more curated material. This kind of content can be used for a variety of desired outcomes: to help students explore case studies more deeply, for example, or to complete projects in virtual workspaces, such as Miro or Google Jamboard, for which students may need a how-to assist.

Advantages of curated content:

It’s often quite engaging, and much of it is very professionally done.

Once you have located good content, there is relatively little an instructor needs to do other than cue it up.

Disadvantages of curated content:

When you use too much of this type of content, students may get the impression that you haven’t prepared for their specific needs.

Some content isn’t research based, or it can even put forth theoretical ideas that are unsupported or flawed. You must verify the quality of the content for yourself.

When I use it:

Pretty much anywhere—interwoven amid asynchronous edited video content or in synchronous classes, whether online or in person.

Pulling This All Together: An Example

The thought of putting all of these pieces—and there are a lot of them—together can feel like assembling a difficult puzzle. But by taking a fresh look at these technologies and thinking through how these use cases may support your course objectives, you can land on some really powerful learning experiences for your students.

Here is an example of how I tried to get the mix right for a course called Managing Innovation that I teach in Ivey’s Accelerated MBA program.

[Image: sample implementation of the Managing Innovation course design]

Robert D. Austin, “ Designing Better Courses: Blending the Best of Pre- and Post-Pandemic Pedagogy ,” Harvard Business Publishing Education, July 21, 2021. Accessed September 8, 2021.

To step through this in more detail, watch the video below to hear me talk through this sample implementation.

The New Normal of Teaching Includes Digital Tools

No matter how enticing it may be to return to your previous “normal”—a normal in which perhaps you didn’t incorporate all that many technologies or tools in your teaching—there are many benefits to virtual learning that are worth keeping, from better accessibility for all students to more opportunities for experiential learning that sticks.

By carefully considering the pros and cons of each available technology, you can choose the digital tools that will best support your lesson plans, making each stage of your course as effective and memorable for your students as possible.

TELL US WHAT YOU THINK: Do you use other technologies in your online, hybrid, or in-person courses that aren’t on this list? We want to hear from you. Email us at [email protected] .


Robert D. Austin is a professor of information systems at Ivey Business School and an affiliated faculty member at Harvard Medical School. He has published widely, authoring nine books, more than 50 cases and notes, three Harvard online products, and two popular massive open online courses (MOOCs) running on the Coursera platform.



How Technology Is Changing the Future of Higher Education

Labs test artificial intelligence, virtual reality and other innovations that could improve learning and lower costs for Generation Z and beyond.


By Jon Marcus

This article is part of our latest Learning special report. We’re focusing on Generation Z, which is facing challenges from changing curriculums and new technology to financial aid gaps and homelessness.

MANCHESTER, N.H. — Cruising to class in her driverless car, a student crams from notes projected on the inside of the windshield while she gestures with her hands to shape a 3-D holographic model of her architecture project.

It looks like science fiction, an impression reinforced by the fact that it is being demonstrated in virtual reality in an ultramodern space with overstuffed pillows for seats. But this scenario is based on technology already in development.

The setting is the Sandbox ColLABorative, the innovation arm of Southern New Hampshire University, on the fifth floor of a downtown building with panoramic views of the sprawling red brick mills that date from this city’s 19th-century industrial heyday.

It is one of a small but growing number of places where experts are testing new ideas that will shape the future of a college education, using everything from blockchain networks to computer simulations to artificial intelligence, or A.I.

Theirs is not a future of falling enrollment, financial challenges and closing campuses. It’s a brighter world in which students subscribe to rather than enroll in college, learn languages in virtual reality foreign streetscapes with avatars for conversation partners, have their questions answered day or night by A.I. teaching assistants and control their own digital transcripts that record every life achievement.

The possibilities for advances such as these are vast. The structure of higher education as it is still largely practiced in America is as old as those Manchester mills, based on a calendar that dates from a time when students had to go home to help with the harvest, and divided into academic disciplines on physical campuses for 18- to 24-year-olds.

Universities may be at the cutting edge of research into almost every other field, said Gordon Jones, founding dean of the Boise State University College of Innovation and Design. But when it comes to reconsidering the structure of their own, he said, “they’ve been very risk-averse.”

Now, however, squeezed by the demands of employers and students — especially the up-and-coming Generation Z — and the need to attract new customers, some schools, such as Boise State and Southern New Hampshire University, are starting labs to come up with improvements to help people learn more effectively, match their skills with jobs and lower their costs.

More than 200 have added senior executives whose titles include the words “digital” or “innovation,” the consulting firm Entangled Solutions found; many were recruited from the corporate and tech sectors. M.I.T. has set up a multimillion-dollar fund to pay for faculty to experiment with teaching innovations.

Some colleges and universities are collaborating on such ideas in groups including the University Innovation Alliance and the Marvel Universe-worthy HAIL Storm — it stands for Harvesting Academic Innovation for Learners — a coalition of academic innovation labs.

If history is a guide, the flashiest notions being developed in workshops in these places won’t get far. University campuses are like archaeological digs of innovations that didn’t fulfill their promises. Even though the biggest leap forward of the last few decades, for example — delivering courses online — appears to have lowered costs, the graduation rates of online higher education remain much lower than those of programs taught in person.

“One of the most important things we do here is disprove and dismantle ideas,” said William Zemp, chief strategy and innovation officer at Southern New Hampshire University.

“There’s so much white noise out there, you have to be sort of a myth buster.”

But some ambitious concepts are already being tested.

College by Subscription

One of these would transform the way students pay for higher education. Instead of enrolling, for example, they might subscribe to college; for a monthly fee, they could take whatever courses they want, when they want, with long-term access to advising and career help.

The Georgia Institute of Technology is one of the places mulling a subscription model, said Richard DeMillo, director of its Center for 21st Century Universities. It would include access to a worldwide network of mentors and advisers and “whatever someone needs to do to improve their professional situation or acquire a new skill or get feedback on how things are going.”

Boise State is already piloting this concept. Its Passport to Education costs $425 a month for six credit hours or $525 for nine in either of two online bachelor’s degree programs. That’s 30 percent cheaper than the in-state, in-person tuition.

Paying by the month encourages students to move faster through their educations, and most are projected to graduate in 18 months, Mr. Jones said. The subscription model has attracted 47 students so far, he said, with another 94 in the application process.
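To make the pricing concrete, here is the back-of-the-envelope arithmetic implied by the figures above. The totals are a rough extrapolation from the quoted monthly fees and the 18-month projection, not published program costs.

```python
# Rough extrapolation from the figures quoted above; not Boise State's
# published totals. Assumes the 18-month projection applies at either pace,
# which overstates the nine-credit total if faster progress shortens
# the subscription.
monthly_fee_six_credits = 425    # dollars per month for six credit hours
monthly_fee_nine_credits = 525   # dollars per month for nine credit hours
months_to_graduate = 18          # projected typical time to degree

print(f"Six credits/month:  ${monthly_fee_six_credits * months_to_graduate:,}")   # $7,650
print(f"Nine credits/month: ${monthly_fee_nine_credits * months_to_graduate:,}")  # $9,450
```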

However they pay for it, future students could find other drastic changes in the way their educations are delivered.

Your Teacher Is a Robot

Georgia Tech has been experimenting with a virtual teaching assistant named Jill Watson, built on the Jeopardy-winning IBM Watson supercomputer platform. This A.I. answers questions in a discussion forum alongside human teaching assistants; students often can’t distinguish among them, their professor says. More Jill Watsons could help students get over hurdles they encounter in large or online courses. The university is working next on developing virtual tutors, which it says could be viable in two to five years.

S.N.H.U., in a collaboration with the education company Pearson, is testing A.I. grading. Barnes & Noble Education already has an A.I. writing tool called bartleby write, named for the clerk in the Herman Melville short story, that corrects grammar, punctuation and spelling, searches for plagiarism and helps create citations.

At Arizona State University, A.I. is being used to watch for signs that A.S.U. Online students might be struggling, and to alert their academic advisers.

“If we could catch early signals, we could go to them much earlier and say, ‘Hey you’re still in the window’ ” to pass, said Donna Kidwell, chief technology officer of the university’s digital teaching and learning lab, EdPlus.
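The article doesn’t describe how EdPlus’s system works, but the general shape of such an early-warning check can be sketched simply: compute an engagement signal for each student and flag anyone below a threshold for adviser outreach. Everything in the sketch below (the login-count signal, the threshold, the field names) is invented for illustration.

```python
# A deliberately simplified sketch of an early-warning check of the kind
# described above. The real EdPlus system is not public; the signal, the
# threshold, and all field names here are hypothetical.
def flag_struggling(students, min_logins_per_week=2):
    """Return names of students whose recent activity falls below the threshold."""
    return [s["name"] for s in students if s["logins_last_week"] < min_logins_per_week]

roster = [
    {"name": "Avery", "logins_last_week": 5},
    {"name": "Blake", "logins_last_week": 0},  # "still in the window" to pass
]
print(flag_struggling(roster))  # -> ['Blake']
```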

Another harbinger of things to come sits on a hillside near the Hudson River in upstate New York, where an immersion lab with 15-foot walls and a 360-degree projection system transports Rensselaer Polytechnic Institute language students to China, virtually.

The students learn Mandarin Chinese by conversing with A.I. avatars that can recognize not only what they say but their gestures and expressions, all against a computer-generated backdrop of Chinese street markets, restaurants and other scenes.

Julian Wong, a mechanical engineering major in the first group of students to go through the program, “thought it would be cheesy.” In fact, he said, “It’s definitely more engaging, because you’re actively involved with what’s going on.”

Students in the immersion lab mastered Mandarin about twice as fast as their counterparts in conventional classrooms, said Shirley Ann Jackson, the president of Rensselaer.

Dr. Jackson, a physicist, was not surprised. The students enrolling in college now “grew up in a digital environment,” she said. “Why not use that to actually engage them?”

Slightly less sophisticated simulations are being used in schools of education, where trainee teachers practice coping with simulated schoolchildren. Engineering students at the University of Michigan use an augmented-reality track to test autonomous vehicles in simulated traffic.

A Transcript for Life

The way these kinds of learning get documented is also about to change. A race is underway to create a lifelong transcript.

Most academic transcripts omit work or military histories, internships, apprenticeships and other relevant experience. And course names such as Biology 301 or Business 102 reveal little about what students have actually learned.

“The learner, the learning provider and the employer all are speaking different languages that don’t interconnect,” said Michelle Weise, chief innovation officer at the Strada Institute for the Future of Work.

A proposed solution: the “interoperable learning record,” or I.L.R. (proof that, even in the future, higher education will be rife with acronyms and jargon).

The I.L.R. would list the specific skills that people have learned — customer service, say, or project management — as opposed to which courses they passed and majors they declared. And it would include other life experiences they accumulated.

This “digital trail” would remain in the learner’s control to share with prospective employers and make it easier for a student to transfer academic credits earned at one institution to another.

American universities, colleges and work force training programs are now awarding at least 738,428 unique credentials, according to a September analysis by a nonprofit organization called Credential Engine, which has taken on the task of translating these into a standardized registry of skills.

Unlike transcripts, I.L.R.s could work in two directions. Not only could prospective employees use them to look for jobs requiring the skills they have; employers could comb through them to find prospective hires with the skills they need.
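No I.L.R. schema is specified in the article, but the description above suggests roughly what such a record might hold. The sketch below is hypothetical: every field name is invented, and the two-directional lookup is reduced to a single skill check.

```python
# Hypothetical sketch of an interoperable learning record (I.L.R.) based on
# the description above. No standard schema is given in the article; all
# field names here are invented for illustration.
learning_record = {
    "learner_id": "example-learner-001",   # the record stays in the learner's control
    "skills": [                            # specific skills, not course names or majors
        {"skill": "customer service", "source": "retail work history"},
        {"skill": "project management", "source": "internship + coursework"},
    ],
    "experiences": [                       # work, military, apprenticeships, internships
        {"type": "internship", "organization": "Example Co.", "year": 2019},
    ],
}

# Two-directional use: a learner searches postings by skill, and an employer
# searches records for candidates with a skill they need.
def has_skill(record, skill_name):
    return any(s["skill"] == skill_name for s in record["skills"])

print(has_skill(learning_record, "project management"))  # -> True
```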

“We’re trying to live inside this whole preindustrial design and figure out how we interface with technology to take it further,” said Dr. Kidwell of Arizona State. “Everybody is wrangling with trying to figure out which of these experiments are really going to work.”

This story was produced in collaboration with The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.


Artificial intelligence (AI) learning tools in K-12 education: A scoping review

  • Open access
  • Published: 06 January 2024


Iris Heung Yue Yim (ORCID: orcid.org/0000-0002-5392-0092) and Jiahong Su (ORCID: orcid.org/0000-0002-9681-7677)


Artificial intelligence (AI) literacy is a global strategic objective in education. However, little is known about how AI should be taught. In this paper, 46 studies in academic conferences and journals are reviewed to investigate pedagogical strategies, learning tools, and assessment methods in AI literacy education in K-12 contexts, as well as students’ learning outcomes. The investigation reveals that the promotion of AI literacy education has seen significant progress in the past two decades. The review highlights intelligent agents, including Google’s Teachable Machine, Learning ML, and Machine Learning for Kids, as age-appropriate tools for AI literacy education in K-12 contexts. Kindergarten students can benefit from learning tools such as PopBots, while software devices, such as Scratch and Python, which help to develop computational thinking about AI algorithms, can be introduced in both primary and secondary schools. The research shows that project-based, human–computer collaborative learning and play- and game-based approaches, grounded in constructivist methodologies, have been applied frequently in AI literacy education. Cognitive, affective, and behavioral learning outcomes, course satisfaction, and soft skills acquisition have been reported. The paper informs educators of appropriate learning tools, pedagogical strategies, and assessment methodologies in AI literacy education, as well as students’ learning outcomes. Research implications and future research directions within the K-12 context are also discussed.


Introduction

Artificial intelligence (AI) was defined in 1956 as “the science and engineering of creating intelligent machines” (McCarthy, 2004, p.2). AI education is considered a driver of economic growth, future workforce development, and global competitiveness (Cetindamar et al., 2022; Sestino & De Mauro, 2022). Researchers’ interest in equipping students with AI knowledge, skills, and attitudes to thrive in an AI-rich future (Miao et al., 2021; Rina et al., 2022; Wang & Cheng, 2021) has given rise to the term “AI literacy”, which concerns the design and implementation of AI learning activities, learning tools and applications, and pedagogical models. Some educators focus on demonstrating machine learning through activities for mastering coding skills and AI concepts (Marques et al., 2020), while others suggest focusing on computational thinking and engagement in deductive and logical reasoning practices (Wong, 2020). In this paper, it is argued that AI education should be extended beyond universities to K-12 students.

There have been a number of recent studies of AI in the context of kindergartens (Su & Yang, 2022; Williams et al., 2019a, 2019b), primary schools (Ali et al., 2019; Shamir & Levin, 2021), and secondary schools (Norouzi et al., 2020; Yoder et al., 2020). However, little is known about what and how AI should be taught (Su et al., 2023a; Ng et al., 2023; Van Brummelen et al., 2021). One challenge is delivering AI content in an age-appropriate and effective manner (Su et al., 2023b; Su & Yang, 2023). Despite the numerous AI learning tools available in K-12 contexts (Rizvi et al., 2023; Van Brummelen et al., 2021), such as the Turtle Robot (Papert & Solomon, 1971), PopBots (Williams et al., 2019a), and LearningML applications (Rodríguez-García et al., 2020), many educators are concerned about the suitability of these tools (Chiu & Chai, 2020; Su & Yang, 2023).

With the development of age-appropriate learning tools, AI concepts can be simplified via visual representation, such as block-based programming (Estevez et al., 2019). For example, Scratch, a high-level block-based programming language, allows students with limited reading ability to create computer programs by using illustrations and visual elements (such as icons and shapes) without having to rely on traditional written instructions (Park & Shin, 2021). AI tools and platforms, including Zhorai (Lin et al., 2020), Learning ML (Rodríguez-Garciá et al., 2021), Machine Learning for Kids (Sabuncuoglu, 2020), and Scratch (Li & Song, 2019), have a positive impact on students’ AI knowledge and skills. Chen et al. (2020) noted that despite the introduction of various learning tools to teach AI, there has not been enough discussion on how AI content should be taught and how tools should be used to support pedagogical strategies and related educational outcomes.

Theoretical model

The technology-based learning model of Hsu et al. (2012) is adopted and modified in this study; it has been widely used by other researchers conducting similar systematic reviews (Chang et al., 2018, 2022; Darmawansah et al., 2023; Tu & Hwang, 2020), as shown in Fig. 1. Hsu et al. (2012) suggested cross-analyzing academic research trends by examining the associations among three categories: research methods, research issues, and application domains. They argue, for example, that by exploring how the topic of a study may affect the selection of its sample and participants, a more thorough and comprehensive analysis can be conducted. Their proposed technology-based learning model has helped frame the research questions of the present study.

Figure 1. Modified technology-based learning model by the researchers of this review (adapted from Hsu et al., 2012)

According to Hsu et al. (2012), “research methods”, “research issues”, and “application domains” are the three main categories to be considered in the development of a coding scheme to gauge research trends in the field of technology-based learning and education. In terms of research methods, a quantitative, qualitative, and mixed approach is employed in this study to construct the coding scheme for the review of the literature (McMillan & Schumacher, 2010). In terms of research issues, with reference to Chang et al. (2018), learning outcomes are categorized as cognitive, affective, behavioral, and skills acquisition outcomes. Finally, two application domains are pursued in this paper: (1) the pedagogical strategies commonly used in science courses, which were employed by Lai and Hwang (2015) and which include constructive, reflective, didactic, and unplugged pedagogies (Cope & Kalantzis, 2016), and (2) the learning tools, namely, hardware, software, intelligent agents, and unplugged strategies, which are coded as suggested by Ng and Chu (2021).

Research objectives

In this study, the literature on pedagogical strategies, assessment methods, learning tools, and learning outcomes in AI K-12 settings is studied. Four research questions are formulated.

RQ1: What are the potential learning tools identified in AI K-12 education?

RQ2: What pedagogical strategies are commonly proposed by studies on AI K-12 learning tools?

RQ3: What learning outcomes have been demonstrated in studies on AI K-12 learning tools?

RQ4: What are the research and assessment methods used in studies on AI K-12 learning tools?

This study follows the same four steps employed in other studies on AI literacy in K-12 (e.g., Ng et al., 2022; Su et al., 2022): (1) identifying relevant studies, (2) selecting and excluding eligible studies, (3) analyzing data, and (4) reporting findings. In this study, the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines (Moher et al., 2015) are followed.

Identifying relevant studies

The electronic databases used for the literature search were ACM, EBSCO, Web of Science, and Scopus. The aim of this review is to cover learning tools across all of K-12 education, encompassing early childhood, primary, and secondary education. Because the education systems of different countries vary, the K-12 search string used in this paper covers students from kindergarten through secondary school. In addition, learning tools are defined as the variety of learning platforms and systems, educational applications, and activities that can enhance the teaching process and support students in AI literacy learning. The search strings therefore reflect these specific definitions of K-12 and learning tools, as shown in Table 1.

Study selection and exclusion

To ensure the generalizability of the findings and to avoid biases in article selection, specific inclusion and exclusion criteria are employed in this study (Table 2).

As shown in Fig. 2, a total of 326 articles were identified: 105 from EBSCO, 81 from Web of Science, 110 from Scopus, and 30 from ACM. Articles were then excluded on the following grounds: (1) studies irrelevant to the research topic (N = 251); for example, Bai and Yang (2019) was excluded because it applied a deep learning technology recommendation system to improve teachers’ information technology ability and was conducted in a context other than AI literacy education, learning, and instruction, while Mahon et al. (2022) presented the design of an online machine learning and artificial intelligence course for secondary school students but did not discuss in detail what types of learning tools can be used and how to support students’ AI literacy learning; a discussion paper by Karalekas et al. (2023), a theoretical paper by Leitner et al. (2023), and a scoping review by Marques et al. (2020) were also removed because they were not empirical studies and did not involve any practical experiment; (2) duplicate studies (N = 10); (3) studies not written in English (N = 4); (4) non-research studies (N = 10); and (5) other types of articles (N = 8). Finally, 46 studies were selected, as shown in Appendix 1.

Figure 2. PRISMA diagram of included articles in the scoping review

The snowball method

To enhance the systematic search for relevant literature, the snowballing method as outlined by Sayers (2008) was employed. This involved tracing references in previously selected articles. The focus was on the references cited in the earlier selected articles as discovered through Google Scholar. Utilizing the snowballing method led to the identification of three additional articles that met the eligibility criteria described above.

Overview of selected studies

Table 3 presents an overview of the 46 selected studies, including the type of articles, year of publication, and educational level.

Publication trends

Forty-six articles were identified: 28 conference papers and 18 journal articles. The first article was published in 1995, and 39 articles have been published in the past 5 years, with a peak in 2021 (Fig. 3).

Figure 3. The trend of AI literacy education in K-12 contexts

Most research took place in the USA (N = 8), China (N = 7), Finland (N = 3), Hong Kong (N = 3), Israel (N = 3), Spain (N = 3), Australia (N = 2), and Japan (N = 2). Others were conducted in Brazil, Denmark, Greece, Indonesia, New Zealand, Norway, Sweden, Thailand, and the UK. The locations of the remaining six articles are unknown.

Educational levels

Primary and secondary schools are the most researched educational levels, each accounting for 44% of the selected articles, followed by kindergartens (11%) and studies spanning all of K-12 (2%).

The selected studies generally include samples of students of both genders and a wide range of ages, from 3-year-old kindergarten students (Vartiainen et al., 2020) to 20-year-old Danish high school students (Kaspersen et al., 2021). The samples also encompass participants in science, technology, engineering, and mathematics (STEM) classes (Ho et al., 2019), high-performing students in the Scientists in School program (Heinze et al., 2010), students with and without an AI background (Yoder et al., 2020), and students from varying socioeconomic backgrounds (Kaspersen et al., 2021).

There were three AI-related research studies between 1995 and 2017, mostly adopting unplugged activities and games for AI teaching, which differ from research conducted after 2017. The first article was published by Scherz and Haberman (1995), who designed a special AI curriculum with the use of abstract data types and instructional models (e.g., graphs and decision trees) to teach AI concepts such as logic programming and AI systems to high school students in Israel. In another two studies, the use of programming robots (Heinze et al., 2010) and computer science unplugged activities (Lucas, 2009) was explored with Australian and New Zealand K-6 students, respectively. Since then, a greater variety of learning tools have been employed, expanding to European and Asian countries across all educational levels in K-12 settings. Appendix 1 provides an overview of the selected articles.

The potential learning tools identified in K-12 contexts were intelligent agents (N = 20), software-focused devices (N = 19), hardware-focused devices (N = 10), and unplugged activities (N = 6) (Fig. 4 and Table 4). In this section, intelligent agents, software devices, and hardware devices are discussed.

Figure 4. Summary of learning tools used in AI K-12 education

Intelligent agents

Intelligent agents, such as Google Teachable Machine, Learning ML, and Machine Learning for Kids, which make decisions based on environmental inputs by using their sensors and actuators, are the most popular learning tools for enhancing students’ computational thinking skills within K-12 contexts. Teachable Machine is a web-based tool developed by Google and has been found to be more effective than unplugged activities in kindergarten settings (Lucas, 2009; Vartiainen et al., 2020). In Vartiainen et al. (2020), children aged between 3 and 9 autonomously explored the input‒output relationship with Google Teachable Machine, which fostered their intellectual curiosity, developed their computational thinking, and enhanced their understanding of machine learning. In both primary (Toivonen et al., 2020; Melsión et al., 2021) and secondary schools (Kilhoffer et al., 2023; Martins et al., 2023), Google Teachable Machine has been employed, allowing students to use their webcams, images, or sounds, without coding, to develop their own machine learning classification models.
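Teachable Machine itself is a no-code web tool, but the train-then-classify loop it walks students through can be approximated in a few lines of scikit-learn. The sketch below is an analogy only: the two-number "image features" and labels are made up, and a real lesson would use webcam frames or audio clips rather than hand-written vectors.

```python
# An approximation of the Teachable Machine workflow: gather labeled
# examples, fit a classifier, classify a new example. The feature vectors
# here are invented stand-ins for features extracted from webcam images.
from sklearn.neighbors import KNeighborsClassifier

examples = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]]
labels = ["thumbs up", "thumbs up", "thumbs down", "thumbs down"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(examples, labels)

print(model.predict([[0.85, 0.15]]))  # -> ['thumbs up']
```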

In addition, Learning ML has been employed in primary schools to create AI-driven solutions and models, for example, to teach the principle of supervised machine learning (Voulgari et al., 2021; Rodríguez-Garciá et al., 2021), which simplifies abstract AI algorithms for primary school students. Machine Learning for Kids, which introduces the power of the IBM Watson engine for AI modelling (Fernández-Martínez et al., 2021), Cognimates (Sabuncuoglu, 2020; Fernández-Martínez et al., 2021), which allows students to practice coding, and Ecraft2Learn, which contains a deep learning functionality (Kahn et al., 2018), have also been used in secondary school classrooms. Intelligent agents often offer students hands-on experience in developing datasets and building customized machine learning systems.

Software devices

Software devices are adopted to enable mostly primary and secondary school students to learn about computational thinking, including programming for sequences, rule-based and conditional mechanisms, as well as data science and machine learning using visual languages. For example, Scratch, a block-based programming software, is frequently used in both primary (Dai et al., 2023; Li & Song, 2019; Shamir & Levin, 2021) and secondary schools (Estevez et al., 2019; Fernández-Martínez et al., 2021). Other software is used for visualizing and scaffolding abstract AI concepts through online games and experiences, such as Quick, Draw! (Martins et al., 2023) and Music Box (Han et al., 2018). In primary schools, Kitten is used to teach block-based programming (Li & Song, 2019), whereas C++ and JavaScript are used for logical thinking and simulation (Gong et al., 2020). In secondary schools, researchers have often employed free online software and tools, such as Snap (Yoder et al., 2020) and Python (Gong et al., 2018; Norouzi et al., 2020), for algorithm automation, as well as RapidMiner for no-code data science learning (Sakulkueakulsuk et al., 2018). To introduce machine learning concepts to secondary school students, other researchers have focused on developing online games such as the Rock Paper Game (Kajiwara et al., 2023) and the 3D role-player video game ML Quest (Priya et al., 2022).

Hardware devices

In addition, hardware, such as robotics and physical artifacts, has also been used with built-in software to supplement students’ understanding of AI concepts. Williams et al. (2019a, 2019b) introduced a preschool-oriented programming platform consisting of a social robot (PopBot) and a block-based programming interface. In Williams et al. (2019a), 80 prekindergarten to second-grade children (aged four to seven) were asked to build their own LEGO robot characters by using DUPLO block programming. PopBot is used as a learning companion to demonstrate its human-like behavior and to demystify AI concepts for younger students.

The lawn bowling robot (Ho et al., 2019), the Zhorai conversational robot (Lin et al., 2020), Micro:bit (Lin et al., 2021), and plush toys (Tseng et al., 2021) have been used in primary schools, while the CUHKiCar (Chiu et al., 2021), the Alpha robot dog (Chai et al., 2020), and Raspberry Pi with Raspbian and a four-wheel drive chassis (Gong et al., 2018) have been used in secondary schools. For example, in Ho et al. (2019), grade six students built lawn-bowling robots for games and competitions while learning about the binary search and optimization algorithms of machine learning. Chiu et al. (2021) introduced the robotic CUHKiCar to secondary school students so that they could perform face-tracking and line-following tasks.

As shown in Fig. 5, the four orientations of pedagogy are summarized as authentic/constructive, reflective, didactic, and unplugged. While a total of 17 potential pedagogical strategies were identified within the four orientations in K-12 contexts (Table 5), authentic/constructive methodologies with project-based learning (N = 27) were the most popular pedagogy used across kindergartens (Williams et al., 2019a, 2019b), primary schools (Toivonen et al., 2020; Rodríguez-Garciá et al., 2021), and secondary schools (Gong et al., 2018; Kilhoffer et al., 2023; Sakulkueakulsuk et al., 2018). When teaching AI to students with a diverse range of needs, the evidence demonstrates the positive impact of combining multiple pedagogical approaches in K-12 studies (Heinze et al., 2010; Lee et al., 2021; Williams et al., 2019a, 2019b).

Figure 5. Four orientations of pedagogical strategies commonly used in AI K-12 education

First, among authentic and constructive methodologies, project-based (N = 27), human-computer interaction (N = 7), and play-based active learning (N = 5) approaches have been commonly used in K-12 education. Offering students hands-on opportunities to learn about real-world applications of AI is an example of project-based learning (Fernández-Martínez et al., 2021; Han et al., 2018; Williams et al., 2019a). Other researchers have examined whether students can acquire AI knowledge through human-computer interactive experiences, such as Zhorai (Melsión et al., 2021) and Google Teachable Machine (Lin et al., 2020; Vartiainen et al., 2020), and have found that this does not require any prior knowledge of AI models. In addition, child-centered play-based learning can effectively engage students and encourage them to take the initiative to construct knowledge during the process of imaginative play (Heinze et al., 2010), which involves students adopting the roles of AI developer, tester, and AI robot (Henry et al., 2021).

Pedagogical strategies in kindergartens

Researchers have often used project-based approaches (N = 3), human-computer interactions (N = 3), play-based learning (N = 1), and unplugged activities (N = 1) to teach younger students AI concepts. In a project-based learning approach, students learn by actively engaging in real-world projects. Williams et al. (2019a, 2019b) used a hands-on project allowing prekindergarten and kindergarten students to acquire AI concepts, including knowledge-based systems, supervised machine learning, and AI generative music. Alternatively, Vartiainen et al. (2020) studied human-computer interactions that allowed students to freely explore the input‒output relationship with Google Teachable Machine to identify and evaluate a problem and find a solution to it. Heinze et al. (2010) focused on imaginative play, which is relevant to young students, as play is associated with various levels of autonomy and provides an engaging introduction to AI and the formation of scientific concepts. Lucas (2009) used unplugged activities to teach the key concepts of computing, including data encoding, data compression, and error detection.

Pedagogical strategies in primary schools

Project-based learning is more frequently used in primary schools than in kindergartens: It has been reported as a learning approach in 14 of the 18 studies of primary school settings, compared to only three of the five studies in the kindergarten setting. Similarly, in primary school settings, studies have revealed a strong dependence on play/game-based (N = 5) and human-computer interaction learning approaches (N = 3).

Projects that demonstrate students’ improved AI knowledge have been conducted. Machine learning projects (Toivonen et al., 2020), LearningML projects (Rodríguez-Garciá et al., 2021), and “AI+” projects (Han et al., 2018) have been designed to demystify AI knowledge. Henry et al. (2021) integrated machine learning in role-playing games, while Shamir and Levin (2021) allowed students to play with AI chatbots to develop AI models and to construct a rule-based machine-learning system. Some researchers have designed learning programs that offer human-computer interaction activities to educate students about gender bias (Melsión et al., 2021) and the social impact of mistakes made by AI models in training datasets (Lin et al., 2020).

Pedagogical strategies in secondary schools

The project-based learning approach (N = 10) is also the most dominant in secondary schools, followed by collaborative learning (N = 5). First, project-based learning is used to engage students by applying their AI knowledge to solve real-world problems. Teachers have reported that AI projects and hands-on activities are effective in keeping students focused on tasks (Kilhoffer et al., 2023). For example, a smart car-themed AI project (Gong et al., 2018), the Redesign YouTube project (Fernández-Martínez et al., 2021), and the agriculture-based AI Challenge project (Sakulkueakulsuk et al., 2018) have been introduced to provide hands-on experience for students to connect their knowledge to their day-to-day lives. Through active exploration, such projects prompt secondary school students to contemplate the personal, social, economic, and ethical consequences of AI technologies (Kaspersen et al., 2021).

Second, collaborative learning allows students to work in groups to promote cognitive knowledge, as it engages them in scientific inquiry with the help of smart devices (Wan et al., 2020). Kaspersen et al. (2021) designed a collaborative learning tool, VotestratesML, together with a voting project allowing students to build machine learning models based on real-world voting data to predict results.

Of the 46 articles, 31 reported potential learning outcomes: (1) cognitive outcomes, (2) affective and behavioral outcomes, and (3) the level of course satisfaction and soft skills acquisition.

Cognitive outcomes

Thirty-one studies documented various degrees of positive cognitive outcomes. Students generally showed a basic understanding of AI, including AI rule-based systems (Ho et al., 2019), machine learning principles and applications (Han et al., 2018; Shamir & Levin, 2021), AI ethics (Melsión et al., 2021), and AI limitations (Lin et al., 2020). In Williams et al. (2019a), 70% of prekindergarten and kindergarten students understood knowledge-based systems, whereas Vartiainen et al. (2020) found that, through AI learning tools, younger students developed their computational thinking and their understanding of machine-learning principles and applications. Dai et al. (2023) reported that primary school students taught with analogy-based pedagogy (i.e., using humans as a reference to teach and learn AI) significantly outperformed primary school students taught with the conventional direct instructional approach in developing conceptual understanding, increasing AI technical knowledge proficiency, and raising ethical awareness of AI. Other researchers have argued that primary school students have demonstrated their understanding of AI by constructing and applying machine-learning algorithms with the help of digital role-playing games (Voulgari et al., 2021) and project-based pedagogy (Shamir & Levin, 2021). Through designing and programming a robot, students increased their understanding of AI biases (Melsión et al., 2021). In secondary schools, researchers have also reported an increase in students’ knowledge of AI algorithms (Yoder et al., 2020) and machine learning concepts (Sakulkueakulsuk et al., 2018), as well as their recognition of AI patterns (Wan et al., 2020). For example, students understood fundamental neural network and machine learning concepts by developing a classification model of recycling images (Martins et al., 2023).

Affective and behavioral outcomes

Affective and behavioral outcomes have been identified in AI learning tool studies within K-12 contexts. In general, students’ motivation to learn AI (Han et al., 2018; Shamir & Levin, 2021, 2022) and their interest in the course (Mariescu-Istodor & Jormanainen, 2019; Martins et al., 2023) were enhanced as a result of AI learning activities. Students’ perceptions of the relevance of AI to their lives also increased (Kajiwara et al., 2023; Lin et al., 2021). Students scored high on self-efficacy (Kajiwara et al., 2023; Shamir & Levin, 2022) and confidence (Shamir & Levin, 2021) in training and validating an AI system. In Martins et al. (2023), over 45% of the 108 secondary school student participants in the introductory course “Machine Learning for all” reported that they perceived AI learning as an enjoyable experience, and 63% of them hoped to learn more about machine learning in the future.

Moreover, students reported that they were highly motivated to explore the Teachable Machine (Vartiainen et al., 2020), to design the robotic arm and computer source codes (Ho et al., 2019), to draw animals and sea creatures for the machine learning project (Mariescu-Istodor & Jormanainen, 2019), and to predict the sweetness of mangoes by using machine learning models (Sakulkueakulsuk et al., 2018).

From the behavioral perspective, high student engagement was reported in project-based (Kaspersen et al., 2021; Shamir & Levin, 2021; Wan et al., 2020) and play/game-based (Heinze et al., 2010; Voulgari et al., 2021) settings. Primary students attended all sessions and expressed a desire to join an upcoming AI contingency course (Shamir & Levin, 2021), while secondary students were actively engaged in scientific inquiry (Wan et al., 2020). Students were also keen on recommending AI games to their friends (Voulgari et al., 2021). Therefore, a combination of play/game-based and project-based approaches may consolidate AI concepts through gameplay while enhancing students’ engagement in AI projects (Han et al., 2018).

Level of satisfaction and soft skills acquisition

Students’ level of satisfaction was found to be positively influenced by constructivist (e.g., project-based) and reflective (e.g., learning by design and learning by teaching) pedagogies (Ho et al., 2019; Shamir & Levin, 2021, 2022). In Lin et al. (2020), students reported a high satisfaction level upon acquiring AI knowledge. Their computational thinking and subsequent project performance were also enhanced. All students completed the course and their AI tasks without any previous learning experience (Toivonen et al., 2020).

The findings from the selected articles reveal that a deep understanding of AI promotes the acquisition of various soft skills. Ali et al. (2019) found that students’ intellectual curiosity increased after engaging in the construction of an AI neuron. By using bulletin boards shared electronically and online chats for feedback, their collaboration and communication skills were also enhanced (Shamir & Levin, 2021). Moreover, students reported gaining problem-solving and technical skills when working with AI systems, including coding, designing simple algorithms, and debugging in Scratch learning activities (Dai et al., 2023).

RQ4: What were the research and assessment methods used in AI K-12 learning tools studies?

In this section, an overview is presented of the research methods and data collection procedures within K-12 contexts. Overall, researchers adopted mixed methods (N = 19), qualitative methods (N = 15), and quantitative methods (N = 12) in K-12 research on AI learning tools. Mixed methods are predominantly used in both primary school (e.g., Dai et al., 2023; Martins et al., 2023; Shamir & Levin, 2021; Toivonen et al., 2020) and secondary school contexts (e.g., Chiu et al., 2021; Estevez et al., 2019), whereas qualitative methods are commonly used in kindergartens (e.g., Heinze et al., 2010; Vartiainen et al., 2020), as shown in Table 6.

A variety of assessment methods were used: questionnaires and surveys (N = 30), artifact/performance-based evaluation (N = 15), interviews (N = 14), observations (N = 5), game assessment (N = 1), and field visits (N = 1) (Table 7). The two most commonly used data collection methods, questionnaires and surveys and artifact/performance-based evaluation, are discussed in this section.


Questionnaires and surveys have been used in quantitative methodologies to gauge, for example, kindergarteners’ perception of robotics and theory of mind (e.g., knowledge access, content false belief, and explicit false belief) (Williams et al., 2019a, 2019b).

Surveys were used to evaluate primary school students’ motivation (Lin et al., 2021), self-efficacy in AI learning (Shamir & Levin, 2022), and perceived knowledge and competence (Dai et al., 2023; Mariescu-Istodor & Jormanainen, 2019; Ng et al., 2022). In addition to Ali et al. (2019), who used the Torrance test for assessment, researchers also utilized pre- and posttests (Tseng et al., 2021) to compare the AI learning outcomes of control and treatment groups in primary school settings (Melsión et al., 2021). Others provided an AI educational experience without stating the assessment method (Ho et al., 2019; Lee et al., 2020; Tseng et al., 2021). Heinze et al. (2010) conducted AI learning activities without assessing learning outcomes. Shamir and Levin (2022) designed a questionnaire based on “constructionist validated robotics learning” for machine learning construction (the questionnaire included statements such as “I can make a ML system” and “I can propose ideas for using ML to solve problems”). Dai et al. (2023) used multiple-choice questions (e.g., “Which of the following devices or systems is an intelligent agent?”) to evaluate the AI knowledge of primary school students according to Bloom’s Taxonomy.

In secondary schools, surveys are used to measure students’ information knowledge acquisition (Priya et al., 2022), perceived abilities (Chiu et al., 2021; Ng & Chu, 2021), and futuristic thinking, engagement, interactivity, and interdisciplinary thinking skills (Sakulkueakulsuk et al., 2018). For example, in Priya et al. (2022), surveys were used in the first phase of the study to test the knowledge gained by students in three AI areas, namely, supervised learning (e.g., “What is the underlying idea behind supervised learning?”), gradient descent (e.g., “In gradient descent, how do we reach the optimum point?”), and KNN classification (e.g., “Using the underlying principle of KNN classification, classify a fruit which is surrounded by 2 apples and 1 mango in its nearest neighbors.”). In the second phase of the study, surveys were used to evaluate students’ satisfaction with the design of the game “ML Quest”, which introduced machine learning concepts, based on the quality factors of the technology acceptance model (e.g., “Visualizations displayed by ML-Game are relevant to the concept taught at each level”).
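The KNN survey question quoted above has a concrete answer that is easy to verify: with k = 3 and nearest neighbors consisting of two apples and one mango, a majority vote labels the unknown fruit an apple. A minimal sketch of that vote (ours, not part of the study's materials):

```python
# Majority vote among the k nearest neighbors' labels, as in the KNN survey
# question above: two apples and one mango classify the fruit as an apple.
from collections import Counter

def knn_vote(neighbor_labels):
    """Classify by majority vote among the k nearest neighbors' labels."""
    return Counter(neighbor_labels).most_common(1)[0][0]

print(knn_vote(["apple", "apple", "mango"]))  # -> apple
```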

Artifact-based/performance-based assessments are embedded in a large number of studies to evaluate learning outcomes. Through artifacts (e.g., PopBots), Williams et al. (2019a, 2019b) evaluated kindergarteners’ knowledge and understanding of supervised machine learning. Ho et al. (2019) used a performance-based assessment to assess primary students’ understanding of optimal data training and its AI applications. The artifact analysis of Shamir and Levin (2021) involved the construction of a rule-based AI system, which included designing, understanding, and creating the AI neural network agent. Dai et al. (2023) used a drawing assessment to evaluate primary school students’ understanding of AI and its impact on their cognitive development, using prompt questions (e.g., “What AI can do? What would you like to use AI for?”) to stimulate their thinking.

Moreover, Yoder et al. (2020) focused on secondary school students’ block-based programming artifacts to examine their knowledge of AI search algorithms, including breadth-first search (BFS), as well as their understanding of the possibility of gender bias when AI screening tools are used in recruitment. In Martins et al. (2023), machine learning model artifacts created by students were used as evidence to demonstrate their learning outcomes, and a performance-based assessment was used to evaluate students’ ability to correctly label recycling trash images in the classification process.
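Breadth-first search, which the block-based artifacts in Yoder et al. (2020) implemented, is a standard algorithm; for readers unfamiliar with it, a minimal Python version (ours, not the study's materials) looks like this:

```python
# Breadth-first search: explore a graph level by level with a FIFO queue,
# recording the order in which nodes are first reached.
from collections import deque

def bfs(graph, start):
    visited = [start]
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.append(neighbor)
                queue.append(neighbor)
    return visited

toy_graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(toy_graph, "A"))  # -> ['A', 'B', 'C', 'D']
```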

Discussion and conclusion

The results of this study are consistent with Kandlhofer et al. (2016), who found that a variety of learning tools have been designed to support various learning objectives for students from kindergarten to university. The previous literature also indicates that many learning tools, such as intelligent agents and software, are effective in facilitating adolescents’ and university students’ acquisition of computational thinking skills (Çakiroğlu et al., 2018; Van Brummelen et al., 2021), whereas the availability of such tools for kindergarten and primary students is often overlooked. Few researchers have investigated whether AI learning tools can bridge the learning gap of younger students (Zhou et al., 2020). This study revealed that, without prior programming experience, these learning tools (such as PopBots, Teachable Machine, and Scratch) can help address the diverse needs of younger students across K-12 educational levels (Resnick et al., 2005), leading to a richer visual learning experience and improving instructional quality (Kaspersen et al., 2021; Long & Magerko, 2020).

Previous reviews have indicated that many pedagogies are suitable for AI education, although without reference to students' learning outcomes (Sanui & Oyelere, 2020). The findings of this study enrich existing knowledge of the positive effects of authentic and constructivist pedagogies on affective, behavioral, and cognitive outcomes, as well as on students’ level of satisfaction in AI learning. This study reveals that multiple pedagogies, such as project-based learning, experiential learning, game-based learning, collaborative learning, and human–computer interaction, are widely used in K-12 educational settings. An emerging form of analogy-based pedagogy, which evaluates the AI knowledge of primary school students by assessing their drawings, is also identified. The focus of this analogy-based pedagogical strategy is the comparison of humans and AI, in which humans gradually shift from serving as an analogy to serving as a contrast, highlighting the characteristics, mechanisms, and learning procedures of AI. It demonstrates the dialogic quality of a relationship of shared enquiry and shared thinking between students and AI learning tools. This is significant given the new cognitive demands of the AI era, as it provokes a shift in the role of students by having them think together and learn to learn together (Wegerif, 2011). In future studies, additional emerging pedagogies (Yim, 2023), the co-creation of arts-based possibility spaces (Burnard et al., 2022), and dialogic learning spaces (Wegerif, 2007) in AI literacy education can be explored.

In addition, educational tools and applications not only contribute new ways of knowing and doing; they are increasingly placed at the center of AI literacy activities and programs rather than playing a supporting role, expanding how technology serves the human need for education. The use of multiple educational learning tools and pedagogical strategies may be influenced by various factors in the teaching process, including students’ gender, background knowledge, and educational setting, all of which may affect their learning styles and motivation to learn AI. These factors and issues can be explored in future studies.

In this review, it was found that some studies assessed students’ performance by using the Torrance test for creativity (Ali et al., 2019), an AI knowledge test (Ng et al., 2022; Wan et al., 2020), pre- and postsurveys (Chiu et al., 2021; Estevez et al., 2019), and comparisons between control and treatment groups (Dai et al., 2023; Melsión et al., 2021), while others used subjective measures, including self-report surveys. Although artifact-based and performance-based approaches have been increasingly adopted in data collection, some researchers used them as evidence of learning without scoring them against established marking criteria for assessment purposes. There is room for introducing objective, rubric-based evaluation mechanisms to assess the quality of the suggested methodologies. The current lack of agreement on assessment criteria and instructional feedback shows that further research is needed to support the wide application of AI teaching in K-12 classrooms.

Research implications

From this study, the use of intelligent agents is recommended, including Teachable Machine, Machine Learning for Kids, and Learning ML. Kindergarten students can benefit from learning tools such as PopBots, while software devices such as Scratch and Python can be introduced to demystify core AI principles for primary school students and to create AI-driven solutions and models for secondary school students. Although hardware such as robotics and physical artifacts is generally effective, it may be too costly to scale.

This review reveals that constructivism, constructionism, and computational thinking are instrumental in addressing AI literacy education. Unfortunately, little research has adopted theoretical frameworks or conceptual models of reference for AI curricula, educational activities, or the design of AI learning tools and applications. Theoretical frameworks for AI literacy learning are needed to guide instruction, and the effective use of AI learning tools, for kindergarten, primary, and secondary school students. Usability, AI ethics, and transparency must be addressed in tool design to ensure that issues pertaining to data privacy and security do not arise. Moreover, there is currently insufficient theory-based, rigorous research on the effectiveness of AI educational tools in meeting the diverse learning needs of students. Children may be invited to codesign tools with application designers. Researchers may therefore conduct theory-based, outcome-oriented quantitative and qualitative research on AI educational tools, which may significantly benefit students.

More evaluation and documented analysis regarding the effectiveness of learning tools should be conducted to inform stakeholders of the existing trends in the field, pedagogical strategies, and instructional methods for teacher professional development.

More research, analysis, and evidence are needed to determine the effectiveness of AI learning tools before they are scaled up based on a risk-benefit analysis. Researchers should also clearly define the educational settings in which specific AI learning tools are appropriate to support the effective delivery of AI content in the classroom.

Recommendations

For educators

Aside from providing students with AI knowledge and skills that the market demands (Burgsteiner et al., 2016) and encouraging all citizens to be AI literate (Goel, 2017; Pedro et al., 2019), educators may promote holistic AI literacy education by considering humans, nonhumans (e.g., animals and machines) (Yim, 2023), and environmental elements (Miao & Shiohira, 2022) in their teaching content. Ethical questions should also be considered, including inclusivity, fairness, responsibility, transparency, data justice, and social responsibility (Benjamin, 2019; Crawford, 2021). To provide a roadmap for sustainable AI education implementation and development, it is essential to involve teachers in the design of learning tools and to understand their perceptions regarding AI literacy education, as well as to provide pedagogical strategies, resource development, and needs-based professional training for both preservice and in-service teachers.

For teachers

Children learn best at a certain stage of cognitive development (Ghazi & Ullah, 2015). It is recommended that the content of instruction be consistent with students’ cognitive developmental level, as it influences their readiness and ability to learn (Piaget, 2000). As a result, the technical and content depth of educational learning tools should align with students’ age and the teaching objectives, and teachers should understand students’ cognitive development to plan age-appropriate activities with suitable learning tools. More collaboration among teachers with various pedagogical experiences across educational levels may lead to more innovative and efficient teaching processes.

For researchers

Researchers should report evidence of the reliability and validity of their findings where applicable, since such data are crucial for evaluating the quality of their recommended learning tools or pedagogies. This can also aid other academics in updating their research on existing and developing pedagogical strategies. Researchers may consider designing and developing a standardized AI assessment mechanism that can be used across different grade levels to compare students’ AI literacy. This approach permits the standardization of assessment criteria and instructional feedback and thus better supports the wider application of AI teaching in K-12 classrooms.
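
To make the idea of a cross-grade mechanism concrete, here is a minimal Python sketch of score standardization. The rubric maxima and grade bands are entirely hypothetical; a real instrument would of course need the kind of reliability and validity evidence discussed above.

```python
# Hypothetical grade-band rubric maxima for an AI-literacy assessment.
RUBRIC_MAX = {"K-2": 12, "3-6": 20, "7-9": 30, "10-12": 40}

def standardized_score(raw: int, grade_band: str) -> float:
    """Rescale a raw rubric score to a common 0-100 scale for its band."""
    return 100 * raw / RUBRIC_MAX[grade_band]

# Students at different levels become directly comparable:
print(standardized_score(9, "K-2"))     # 75.0
print(standardized_score(30, "10-12"))  # 75.0
```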

References

References marked with an asterisk indicate reports included in this scoping review

*Ali, S., Payne, B. H., Williams, R., Park, H. W., & Breazeal, C. (2019, June). Constructionism, ethics, and creativity: Developing primary and middle school artificial intelligence education. In International workshop on education in artificial intelligence k-12 (eduai’19) (pp. 1–4).

Bai, H., & Yang, S. (2019, October). Research on the sustainable development model of information technology literacy of normal students based on deep learning recommendation system. In 2019 4th International conference on mechanical, control and computer engineering (icmcce) (pp. 837–8373). IEEE.

Bargh, J. A., & Schul, Y. (1980). On the cognitive benefits of teaching. Journal of Educational Psychology, 72 (5), 593.

Battal, A., Afacan Adanır, G., & Gülbahar, Y. (2021). Computer science unplugged: A systematic literature review. Journal of Educational Technology Systems, 50 (1), 24–47.

Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. Polity.

Berthelsen, D. (2009). Participatory learning. In Participatory learning in the early years: Research and pedagogy (pp. 1–11). Routledge.

Burgsteiner, H., Kandlhofer, M., & Steinbauer, G. (2016). Irobot: Teaching the basics of artificial intelligence in high schools. In Proceedings of the 30th AAAI conference on artificial intelligence (pp. 4126–4127). AAAI Press.

Burnard, P., Dragovic, T., Jasilek, S., Biddulph, J., Rolls, L., Durning, A., & Fenyvesi, K. (2022). The art of co-creating arts-based possibility spaces for fostering STE (A) M practices in primary education. In Arts-based methods in education around the world (pp. 247–281). River Publishers.

Çakiroğlu, Ü., Suiçmez, S. S., Kurtoğlu, Y. B., Sari, A., Yildiz, S., & Öztürk, M. (2018). Exploring perceived cognitive load in learning programming via Scratch. Research in Learning Technology, 26 , 1–20.

Cetindamar, D., Kitto, K., Wu, M., Zhang, Y., Abedin, B., & Knight, S. (2022). Explicating AI literacy of employees at digital workplaces. IEEE Transactions on Engineering Management, 71 , 810–823.

*Chai, C. S., Wang, X., & Xu, C. (2020). An extended theory of planned behavior for the modelling of Chinese secondary school students’ intention to learn Artificial Intelligence. Mathematics, 8 (11), 2089.

Chang, C. C., Hwang, G. J., & Tu, Y. F. (2022). Concept mapping in technology-supported K-12 education: A systematic review of selected SSCI publications from 2001 to 2020. Journal of Educational Computing Research, 60 (7), 1637–1662.

Chang, C. Y., Lai, C. L., & Hwang, G. J. (2018). Trends and research issues of mobile learning studies in nursing education: A review of academic publications from 1971 to 2016. Computers & Education, 116 , 28–48.

Chen, L., Chen, P., & Lin, Z. (2020). Artificial intelligence in education: A review. Ieee Access, 8 , 75264–75278.

Chiu, T. K. F., & Chai, C. S. (2020). Sustainable curriculum planning for artificial intelligence education: A self-determination theory perspective. Sustainability, 12 (14), 5568.

*Chiu, T. K. F., Meng, H., Chai, C. S., King, I., Wong, S., & Yam, Y. (2021). Creation and evaluation of a pre-tertiary Artificial Intelligence (AI) curriculum. IEEE Transactions on Education , 65 (1), 30–39.

Cope, B., & Kalantzis, M. (Eds.). (2016). A pedagogy of multiliteracies: Learning by design . Springer.

Crawford, K. (2021). The atlas of AI: Power, politics, and the planetary costs of artificial intelligence . Yale University Press.

*Dai, Y., Lin, Z., Liu, A., Dai, D., & Wang, W. (2023). Effect of an analogy-based approach of artificial intelligence pedagogy in upper primary schools. Journal of Educational Computing Research . https://doi.org/10.1177/0735633123120134 .

Danniels, E., & Pyle, A. (2023). Inclusive play-based learning: Approaches from enacting kindergarten teachers. Early Childhood Education Journal, 51 (7), 1169–1179.

Darmawansah, D., Hwang, G. J., Chen, M. R. A., & Liang, J. C. (2023). Trends and research foci of robotics-based STEM education: A systematic review from diverse angles based on the technology-based learning model. International Journal of STEM Education, 10 (1), 1–24.

*Estevez, J., Garate, G., & Graña, M. (2019). Gentle introduction to artificial intelligence for high-school students using scratch . IEEE Access, 7 , 179027–179036.

*Fernández-Martínez, C., Hernán-Losada, I., & Fernández, A. (2021). Early introduction of AI in Spanish middle schools. A motivational study. KI-Künstliche Intelligenz, 35 (2), 163–170.

Finkelstein, J. E. (2006). Learning in real time: Synchronous teaching and learning online (Vol. 5). Wiley.

Ghazi, S. R., & Ullah, K. (2015). Concrete operational stage of Piaget’s cognitive development theory: An implication in learning general science. Gomal University Journal of Research, 31 (1), 78–89.

Goel, A. (2017). AI education for the world. AI Magazine, 38 (2), 3–4.

*Gong, X., Tang, Y., Liu, X., Jing, S., Cui, W., Liang, J., & Wang, F. Y. (2020, October). K-9 Artificial Intelligence education in Qingdao: Issues, challenges and suggestions. In 2020 IEEE international conference on networking, sensing and control (ICNSC) (pp. 1–6). IEEE.

*Gong, X., Wu, Y., Ye, Z., & Liu, X. (2018, June). Artificial Intelligence course design: iSTREAM-based visual cognitive smart vehicles. In 2018 IEEE intelligent vehicles symposium (IV) (pp. 1731–1735). IEEE.

Green, M. C. (2004). Transportation into narrative worlds: The role of prior knowledge and perceived realism. Discourse Processes, 38 (2), 247–266.

*Gunasilan, U. (2021). Debate as a learning activity for teaching programming: A case in the subject of machine learning. Higher Education, Skills and Work-Based Learning , 12 (4), 705–718.

*Han, X., Hu, F., Xiong, G., Liu, X., Gong, X., Niu, X., Shi, W., & Wang, X. (2018). Design of AI+ curriculum for primary and secondary schools in Qingdao. In 2018 Chinese automation congress (CAC) (pp. 4135–4140). IEEE.

*Heinze, C. A., Haase, J., & Higgins, H. (2010, July). An action research report from a multi-year approach to teaching artificial intelligence at the k-6 level. In 1st AAAI symposium on educational advances in artificial intelligence .

*Henry, J., Hernalesteen, A., & Collard, A. S. (2021). Teaching Artificial Intelligence to K-12 through a role-playing game questioning the intelligence concept. KI-Künstliche Intelligenz, 35 (2), 171–179.

*Ho, J. W., Scadding, M., Kong, S. C., Andone, D., Biswas, G., Hoppe, H. U., & Hsu, T. C. (2019, June). Classroom activities for teaching artificial intelligence to primary school students. In Proceedings of international conference on computational thinking education (pp. 157–159).

Hourcade, J. P. (2015). Child–computer interaction . Self, Iowa City.

Hsu, Y. C., Ho, H. N. J., Tsai, C. C., Hwang, G. J., Chu, H. C., Wang, C. Y., & Chen, N. S. (2012). Research trends in technology-based learning from 2000 to 2009: A content analysis of publications in selected journals. Journal of Educational Technology & Society, 15 (2), 354–370.

Hung, W., Jonassen, D. H., & Liu, R. (2008). Problem-based learning. In Handbook of research on educational communications and technology (Vol. 3(1), pp. 485–506). Erlbaum Associates.

*Kahn, K. M., Megasari, R., Piantari, E., & Junaeti, E. (2018). AI programming by children using snap! Block programming in a developing country. In EC-TEL practitioner proceedings 2018: 13th European conference on technology enhanced learning .

Kandlhofer, M., Steinbauer, G., Hirschmugl-Gaisch, S., & Huber, P. (2016). Artificial intelligence and computer science in education: From kindergarten to university. In 2016 IEEE frontiers in education conference (FIE) (pp. 1–9). IEEE.

Kajiwara, Y., Matsuoka, A., & Shinbo, F. (2023). Machine learning role playing game: Instructional design of AI education for age-appropriate in K-12 and beyond. Computers and Education: Artificial Intelligence, 5 , 100162.

Karalekas, G., Vologiannidis, S., & Kalomiros, J. (2023). Teaching machine learning in K-12 using robotics. Education Sciences, 13 (1), 67.

*Kaspersen, M. H., Bilstrup, K. E. K., Van Mechelen, M., Hjorth, A., Bouvin, N. O., & Petersen, M. G. (2021, June). VotestratesML: A high school learning tool for exploring machine learning and its societal implications. In FabLearn Europe/MakeEd 2021—an international conference on computing, design and making in education (pp. 1–10).

Keri, Z., & Elbatarny, H. S. (2021). The power of analogy-based learning in science. HAPS Educator, 25 (1), 13–20.

*Kilhoffer, Z., Zhou, Z., Wang, F., Tamton, F., Huang, Y., Kim, P., Yeh, T., & Wang, Y. (2023, May). “How technical do you get? I’m an English teacher”: Teaching and learning cybersecurity and AI ethics in high school. In 2023 IEEE symposium on security and privacy (SP) (pp. 2032–2032). IEEE.

Kolb, A. Y., & Kolb, D. A. (2009). Experiential learning theory: A dynamic, holistic approach to management learning, education and development. In The SAGE handbook of management learning, education and development (Vol. 7, p. 42). SAGE.

Lai, C. L., & Hwang, G. J. (2015). High school teachers’ perspectives on applying different mobile learning strategies to science courses: The national mobile learning program in Taiwan. International Journal of Mobile Learning and Organisation, 9 (2), 124–145.

*Lee, I., Ali, S., Zhang, H., DiPaola, D., & Breazeal, C. (2021, March). Developing middle school students’ AI literacy. In Proceedings of the 52nd ACM technical symposium on computer science education (pp. 191–197).

*Lee, S., Mott, B., Ottenbriet-Leftwich, A., Scribner, A., Taylor, S., Glazewski, K., Hmelo-Silver, C. E., & Lester, J. (2020, June). Designing a collaborative game-based learning environment for AI-infused inquiry learning in elementary school classrooms. In Proceedings of the 2020 ACM conference on innovation and technology in computer science education (p. 566).

Leitner, M., Greenwald, E., Wang, N., Montgomery, R., & Merchant, C. (2023). Designing game-based learning for high school artificial intelligence education. International Journal of Artificial Intelligence in Education, 33 , 384–398.

*Li, K., & Song, S. (2019, June). Application of artificial intelligence in primary and secondary schools: A case study of scratch. In International conference on applications and techniques in cyber security and intelligence (pp. 2026–2030). Springer.

*Lin, P., Van Brummelen, J., Lukin, G., Williams, R., & Breazeal, C. (2020, April). Zhorai: Designing a conversational agent for children to explore machine learning concepts. In Proceedings of the AAAI conference on artificial intelligence (Vol. 34(9), pp. 13381–13388).

*Lin, P. Y., Chai, C. S., Jong, M. S. Y., Dai, Y., Guo, Y., & Qin, J. (2021). Modeling the structural relationship among primary students’ motivation to learn artificial intelligence. Computers and Education: Artificial Intelligence, 2 , 100006.

Long, D., & Magerko, B. (2020, April). What is AI literacy? Competencies and design considerations. In Proceedings of the 2020 CHI conference on human factors in computing systems (pp. 1–16).

*Lucas, J. M. (2009). K-6 outreach using “Computer science unplugged”. Journal of Computing Sciences in Colleges, 24 (6), 62–63.

Lye, S. Y., & Koh, J. H. L. (2014). Review on teaching and learning of computational thinking through programming: What is next for K-12? Computers in Human Behavior, 41 , 51–61.

Mahon, J., Quille, K., Mac Namee, B., & Becker, B. A. (2022, March). A novel machine learning and artificial intelligence course for secondary school students. In Proceedings of the 53rd ACM technical symposium on computer science education (p. 1155).

*Mariescu-Istodor, R., & Jormanainen, I. (2019, November). Machine learning for high school students. In Proceedings of the 19th Koli calling international conference on computing education research (pp. 1–9).

Markham, T., Larmer, J., & Ravitz, J. (2003). Project based learning handbook: A guide to standards-focused project based learning for middle and high school teachers. Buck Institute for Education, Novato.

Marques, L. S., Gresse von Wangenheim, C., & Hauck, J. C. (2020). Teaching machine learning in school: A systematic mapping of the state of the art . Informatics in Education, 19 (2), 283–321.

*Martins, R. M., von Wangenheim, C. G., Rauber, M. F., & Hauck, J. C. (2023). Machine learning for all!—introducing machine learning in middle and high school . International Journal of Artificial Intelligence in Education . https://doi.org/10.1007/s40593-022-00325-y .

McCarthy, J. (2004). What is artificial intelligence? Engineering Materials and Design, 32 (3), 1–14.

McMillan, J. H., & Schumacher, S. (2010). Research in education: Evidence-based inquiry . Pearson.

*Melsión, G. I., Torre, I., Vidal, E., & Leite, I. (2021, June). Using explainability to help children understand gender bias in AI. In Proceedings of the interaction design and children , Athens, Greece (pp. 87–99).

Miao, F., Holmes, W., Huang, R., & Zhang, H. (2021). AI and education: A guidance for policymakers . UNESCO Publishing.

Miao, F., & Shiohira, K. (2022). K-12 AI curricula . UNESCO Publishing.

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & Prisma Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6 (7), e1000097.

*Ng, D. T. K., & Chu, S. K. W. (2021). Motivating students to learn AI through social networking sites: A case study in Hong Kong. Online Learning, 25 (1), 195–208.

Ng, D. T. K., Leung, J. K. L., Su, M. J., Yim, I. H. Y., Qiao, M. S., & Chu, S. K. W. (2023). AI literacy in K-16 classrooms . Springer.

*Ng, D. T. K., Luo, W. Y., Chan, H. M. Y., & Chu, S. K. W. (2022). An examination on primary students’ development in AI Literacy through digital story writing.  Computers and Education: Artificial Intelligence 3, 100054.

*Norouzi, N., Chaturvedi, S., & Rutledge, M. (2020, April). Lessons learned from teaching machine learning and natural language processing to high school students. In Proceedings of the AAAI conference on artificial intelligence (Vol. 34(9), pp. 13397–13403).

Papert, S., & Solomon, C. (1971). Twenty things to do with a computer. In S. Papert, C. Solomon, E. Soloway, & J. C. Spohrer (Eds.), Studying the novice programmer (pp. 3–28). Lawrence Erlbaum Associates.

Park, Y., & Shin, Y. (2021). Tooee: A novel scratch extension for K-12 big data and artificial intelligence education using text-based visual blocks. IEEE Access, 9 , 149630–149646.

Pedaste, M., Mäeots, M., Siiman, L. A., De Jong, T., Van Riesen, S. A., Kamp, E. T., Manoli, C. C., Zacharia, Z. C., & Tsourlidaki, E. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational Research Review, 14 , 47–61.

Pedro, F., Subosa, M., Rivas, A., & Valverde, P. (2019). Artificial intelligence in education: Challenges and opportunities for sustainable development. Roscongress Building Trust.

*Perach, S., & Alexandron, G. (2022, June). A blended-learning program for implementing a rigorous machine-learning curriculum in high-schools. In Proceedings of the 9th ACM conference on learning@ Scale (pp. 267–270).

Piaget, J. (2000). Piaget’s theory of cognitive development. In Childhood cognitive development: The essential readings (Vol. 2, pp. 33–47). Blackwell.

Priya, S., Bhadra, S., Chimalakonda, S., & Venigalla, A. S. M. (2022). ML-Quest: A game for introducing machine learning concepts to K-12 students. Interactive Learning Environments. DOI, 10 (1080/10494820), 2022.

Qin, J. J., Ma, F. G., & Guo, Y. M. (2019). Foundations of artificial intelligence for primary school.

Resnick, M., Myers, B., Nakakoji, K., Shneiderman, B., Pausch, R., Selker, T., & Eisenberg, M. (2005). Design principles for tools to support creative thinking. In Creativity support tools: Report from a US national science foundation sponsored workshop .

Vuorikari, R., Kluzer, S., & Punie, Y. (2022). DigComp 2.2: The Digital competence framework for citizens-with new examples of knowledge, skills and attitudes (No. JRC128415). Joint Research Centre (Seville site).

Rizvi, S., Waite, J., & Sentance, S. (2023). Artificial Intelligence teaching and learning in K-12 from 2019 to 2022: A systematic literature review. Computers and Education: Artificial Intelligence, 4 , 100145.

*Rodríguez-García, J. D., Moreno-León, J., Román-González, M., & Robles, G. (2020, October). Introducing artificial intelligence fundamentals with LearningML: Artificial intelligence made easy. In 8th International conference on technological ecosystems for enhancing multiculturality (pp. 18–20).

*Rodríguez-García, J. D., Moreno-León, J., Román-González, M., & Robles, G. (2021, March). Evaluation of an online intervention to teach artificial intelligence with LearningML to 10–16-year-old students. In Proceedings of the 52nd ACM technical symposium on computer science education (pp. 177–183).

*Sabuncuoglu, A. (2020, June). Designing one year curriculum to teach artificial intelligence for middle school. In Proceedings of the 2020 ACM conference on innovation and technology in computer science education (pp. 96–102).

*Sakulkueakulsuk, B., Witoon, S., Ngarmkajornwiwat, P., Pataranutaporn, P., Surareungchai, W., Pataranutaporn, P., & Subsoontorn, P. (2018, December). Kids making AI: Integrating machine learning, gamification, and social context in STEM education. In 2018 IEEE international conference on teaching, assessment, and learning for engineering (TALE) (pp. 1005–1010). IEEE.

Sanusi, I. T., & Oyelere, S. S. (2020). Pedagogies of machine learning in K-12 context. In 2020 IEEE frontiers in education conference (FIE) (pp. 1–8). IEEE.

Sayers, A. (2008). Tips and tricks in performing a systematic review. British Journal of General Practice, 58 (547), 136–136.

*Scherz, Z., & Haberman, B. (1995, March). Logic programming based curriculum for high school students: The use of abstract data types. In  Proceedings of the 26th SIGCSE technical symposium on computer science education (pp. 331–335).

Sestino, A., & De Mauro, A. (2022). Leveraging artificial intelligence in business: Implications, applications and methods. Technology Analysis & Strategic Management, 34 (1), 16–29.

*Shamir, G., & Levin, I. (2021). Neural network construction practices in elementary school.  KI-Künstliche Intelligenz, 35 (2), 181–189.

*Shamir, G., & Levin, I. (2022). Teaching machine learning in elementary school. International Journal of Child–Computer Interaction, 31 , 100415.

Smerdon, B. A., Burkam, D. T., & Lee, V. E. (1999). Access to constructivist and didactic teaching: Who gets it? Where is it practiced? Teachers College Record, 101 (1), 5–34.

Smith, B. L., & MacGregor, J. T. (1992). What is collaborative learning. In Collaborative learning: A sourcebook for higher education . National Center on Postsecondary Teaching, Learning, and Assessment Publishing, Pennsylvania State University.

Su, J., & Yang, W. (2022). Artificial intelligence in early childhood education: A scoping review. Computers and Education: Artificial Intelligence, 3 , 100049.

Su, J., & Yang, W. (2023). AI literacy curriculum and its relation to children’s perceptions of robots and attitudes towards engineering and science: An intervention study in early childhood education. Journal of Computer Assisted Learning . https://doi.org/10.1111/jcal.12867

Su, J., Zhong, Y., & Ng, D. T. K. (2022). A meta-review of literature on educational approaches for teaching AI at the K-12 levels in the Asia-Pacific region. Computers and Education: Artificial Intelligence, 3 , 100065.

Su, J., Guo, K., Chen, X., & Chu, S. K. W. (2023). Teaching artificial intelligence in K–12 classrooms: a scoping review. Interactive Learning Environments , 1–20.

Su, J., Ng, D. T. K., & Chu, S. K. W. (2023). Artificial intelligence (AI) literacy in early childhood education: The challenges and opportunities. Computers and Education: Artificial Intelligence, 4 , 100124.

*Toivonen, T., Jormanainen, I., Kahila, J., Tedre, M., Valtonen, T., & Vartiainen, H. (2020, July). Co-designing machine learning apps in K-12 with primary school children. In  2020 IEEE 20th International Conference on Advanced Learning Technologies (ICALT) (pp. 308–310). IEEE.

Trybus, J. (2015). Game-based learning: What it is, why it works, and where it's going (p. 6). New Media Institute.

*Tseng, T., Murai, Y., Freed, N., Gelosi, D., Ta, T. D., & Kawahara, Y. (2021, June). PlushPal: Storytelling with interactive plush toys and machine learning. In Proceedings of the interaction design and children (pp. 236–245).

Tu, Y. F., & Hwang, G. J. (2020). Trends and research issues of mobile learning studies in hospitality, leisure, sport and tourism education: A review of academic publications from 2002 to 2017. Interactive Learning Environments, 28 (4), 385–403.

Van Brummelen, J., Heng, T., & Tabunshchyk, V. (2021, May). Teaching tech to talk: K-12 conversational artificial intelligence literacy curriculum and development tools. In Proceedings of the AAAI conference on artificial intelligence (Vol. 35(17), pp. 15655–15663).

*Vartiainen, H., Tedre, M., & Valtonen, T. (2020). Learning machine learning with very young children: Who is teaching whom ?. International Journal of Child-Computer Interaction, 25 , 100182.

*Voulgari, I., Zammit, M., Stouraitis, E., Liapis, A., & Yannakakis, G. (2021, June). Learn to machine learn: designing a game based approach for teaching machine learning to primary and secondary education students. In Proceedings of the interaction design and children (pp. 593–598).

*Wan, X., Zhou, X., Ye, Z., Mortensen, C. K., & Bai, Z. (2020, June). SmileyCluster: Supporting accessible machine learning in K-12 scientific discovery. In Proceedings of the interaction design and children conference (pp. 23–35).

Wang, T., & Cheng, E. C. K. (2021). An investigation of barriers to Hong Kong K-12 schools incorporating Artificial Intelligence in education. Computers and Education: Artificial Intelligence, 2 , 100031.

Wegerif, R. (2007). Dialogic education and technology: Expanding the space of learning (Vol. 7). Springer, London.

Wegerif, R. (2011). Towards a dialogic theory of how children learn to think. Thinking Skills and Creativity, 6 (3), 179–190.

*Williams, R., Park, H. W., & Breazeal, C. (2019b). A is for artificial intelligence: The impact of artificial intelligence activities on young children’s perceptions of robots. In Proceedings of the 2019 CHI conference on human factors in computing systems .

*Williams, R., Park, H. W., Oh, L., & Breazeal, C. (2019a, July). Popbots: Designing an artificial intelligence curriculum for early childhood education. In Proceedings of the AAAI conference on artificial intelligence (Vol. 33(1), pp. 9729–9736).

Wong, K.-C. (2020). Computational thinking and artificial intelligence education: A balanced approach using both classical AI and modern AI. In: Proceedings of international conference on computational thinking education 2020 (p. 108), The Education University of Hong Kong.

Yim, I. H. Y. (2023). Design of Artificial Intelligence (AI) education for primary schools: Arts-based approach (pp. 65–90). ISTES BOOKS.

*Yoder, S., Tatar, C., Aderemi, I., Boorugu, S., Jiang, S., & Akram, B. (2020). Gaining insight into effective teaching of AI problem-solving through CSEDM: A case study. In  5th Workshop on computer science educational data mining.

Zhou, X., Van Brummelen, J., & Lin, P. (2020). Designing AI learning experiences for K-12: Emerging works, future opportunities and a design framework. arXiv preprint. arXiv:2009.10228 .

Acknowledgements

The support and guidance provided by Professor Rupert Wegerif, Dr Annouchka Bayley, Dr Eleanor Dare and the University of Cambridge during this ongoing academic pursuit have been instrumental and greatly appreciated.

This research has not received any funding.

Author information

Authors and Affiliations

Faculty of Education, University of Cambridge, 184 Hills Road, Cambridge, CB2 8PQ, UK

Iris Heung Yue Yim

Faculty of Education, The University of Hong Kong, Pok Fu Lam, Hong Kong SAR, China

Jiahong Su

Corresponding author

Correspondence to Iris Heung Yue Yim .

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Informed consent

Not applicable.

Research involving human participants and/or animals

Ethical statements

We hereby declare that this manuscript is the result of our own independent work, revised in response to the reviewers’ comments. Except for the quoted content, this manuscript does not contain any research achievements that have been published or written by other individuals or groups. We are the sole authors of this manuscript and bear the legal responsibility for this statement.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix 1: Overview of the selected articles

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Yim, I.H.Y., Su, J. Artificial intelligence (AI) learning tools in K-12 education: A scoping review. J. Comput. Educ. (2024). https://doi.org/10.1007/s40692-023-00304-9

Received : 15 July 2023

Revised : 04 October 2023

Accepted : 25 October 2023

Published : 06 January 2024

DOI : https://doi.org/10.1007/s40692-023-00304-9

Keywords

  • Artificial intelligence literacy
  • K-12 students
  • AI literacy education
  • Learning tools

The Evolution of Technology in K–12 Classrooms: 1659 to Today

Alexander Huls is a Toronto-based writer whose work has appeared in The New York Times, Popular Mechanics, Esquire, The Atlantic and elsewhere.

In the 21st century, it can feel like advanced technology is changing the K–12 classroom in ways we’ve never seen before. But the truth is, technology and education have a long history of evolving together to dramatically change how students learn.

With more innovations surely headed our way, why not look back at how we got to where we are today, while looking forward to how educators can continue to integrate new technologies into their learning?

DISCOVER:  Special education departments explore advanced tech in their classrooms.

Using Technology in the K–12 Classroom: A History

1659: Magic Lantern

  • Inventor:  Christiaan Huygens
  • A Brief History:  An ancestor of the slide projector, the magic lantern projected glass slides with light from oil lamps or candles. In the 1680s, the technology was brought to the education space to show detailed anatomical illustrations, which were difficult to sketch on a chalkboard.
  • Interesting Fact:  Huygens initially regretted his creation, thinking it was too frivolous.

1795: Pencil

  • Inventor:  Nicolas-Jacques Conté
  • A Brief History : Versions of the pencil can be traced back hundreds of years, but what’s considered the modern pencil is credited to Conté, a scientist in Napoleon Bonaparte’s army. It made its impact on the classroom, however, when it began to be mass produced in the 1900s.
  • Interesting Fact:  The Aztecs used a form of graphite pencil in the 13th century.

1801: Chalkboard

  • Inventor:  James Pillans
  • A Brief History:  Pillans — a headmaster at a high school in Edinburgh, Scotland — created the first front-of-class chalkboard, or “blackboard,” to better teach his students geography with large maps. Prior to his creation, educators worked with students on smaller, individual pieces of wood or slate. In the 1960s, the creation was upgraded to a green board, which became a familiar fixture in every classroom.
  • Interesting Fact:  Before chalkboards were commercially manufactured, some were made do-it-yourself-style with ingredients like pine board, egg whites and charred potatoes.

1888: Ballpoint Pen

  • Inventor:  John L. Loud
  • A Brief History:  John L. Loud invented and patented the first ballpoint pen after seeking to create a tool that could write on leather. It was not a commercial success. Fifty years later, following the lapse of Loud’s patent, Hungarian journalist László Bíró invented a pen with a quick-drying special ink that wouldn’t smear thanks to a rolling ball in its nib.
  • Interesting Fact:  When ballpoint pens debuted in the U.S., they were so popular that Gimbels, the department store selling them, made $81 million in today’s money within six months.

LEARN MORE:  Logitech Pen works with Chromebooks to combine digital and physical learning.

1950s: Overhead Projector

  • Inventor:  Roger Appeldorn
  • A Brief History:  Overhead projectors were used during World War II for mission briefings. However, 3M employee Appeldorn is credited with creating not only a projectable transparent film, but also the overhead projectors that would find a home in classrooms for decades.
  • Interesting Fact:  Appeldorn’s creation is the predecessor to today’s bright and efficient laser projectors.

1959: Photocopier

  • Inventor:  Chester Carlson
  • A Brief History:  Because of his arthritis, patent attorney and inventor Carlson wanted to create a less painful alternative to making carbon copies. Between 1938 and 1947, working with The Haloid Photographic Company, Carlson perfected the process of electrophotography, which led to the development of the first photocopy machines.
  • Interesting Fact:  Haloid and Carlson named their photocopying process xerography, which means “dry writing” in Greek. Eventually, Haloid renamed its company (and its flagship product line) Xerox .

1967: Handheld Calculator

  • Inventor:   Texas Instruments
  • A Brief History:  As recounted in our history of the calculator, Texas Instruments made calculators portable with a device that weighed 45 ounces and featured a small keyboard with 18 keys and a visual display of 12 decimal digits.
  • Interesting Fact:  The original 1967 prototype of the device can be found in the Smithsonian Institution’s National Museum of American History.

1981: The Osborne 1 Laptop

  • Inventor:  Adam Osborne, Lee Felsenstein
  • A Brief History:  Osborne, a computer book author, teamed up with computer engineer Felsenstein to create a portable computer that would appeal to general consumers. In the process, they provided the technological foundation that made modern one-to-one devices — like Chromebooks — a classroom staple.
  • Interesting Fact:  At 24.5 pounds, the Osborne 1 was about as big and heavy as a sewing machine, earning it the current classification of a “luggable” computer, rather than a laptop.

1990: World Wide Web

  • Inventor:  Tim Berners-Lee
  • A Brief History:  In the late 1980s, British scientist Berners-Lee created the World Wide Web to enable information sharing between scientists and academics. It wasn’t long before the Web could connect anyone, anywhere to a wealth of information, and it was soon on its way to powering the modern classroom.
  • Interesting Fact:  The first web server Berners-Lee created was so new, he had to put a sign on the computer that read, “This machine is a server. DO NOT POWER IT DOWN!”

What Technology Is Used in Today’s K–12 Classrooms?

Technology has come so far that modern classrooms are more technologically advanced than many science labs were two decades ago. Students have access to digital textbooks, personal devices, collaborative cloud-based tools, and interactive whiteboards. Emerging technologies now being introduced to K–12 classrooms include voice assistants, virtual reality devices and 3D printers.

Perhaps the most important thing about ed tech in K–12 isn’t what the technology is, but how it’s used.

How to Integrate Technology into K–12 Classrooms

The first step to integrating technology into the K–12 classroom is figuring out which solution to integrate, given the large variety of tools available to educators. That variety comes with benefits — like the ability to align tech with district objectives and grade level — but also brings challenges.

“It’s difficult to know how to choose the appropriate digital tool or resource,” says Judi Harris, professor and Pavey Family Chair in Educational Technology at the William & Mary School of Education. “Teachers need some familiarity with the tools so that they understand the potential advantages and disadvantages.”

K–12 IT leaders should also be careful not to focus too much on technology implementation at the expense of curriculum-based learning needs. “What districts need to ask themselves is not only whether they’re going to adopt a technology, but how they’re going to adopt it,” says Royce Kimmons, associate professor of instructional psychology and technology at Brigham Young University.

In other words, while emerging technologies may be exciting, acquiring them without proper consideration of their role in improving classroom learning will likely result in mixed student outcomes. For effective integration, educators should ask themselves: In what ways would the tech increase or support a student’s productivity and learning outcomes? How will it improve engagement?

Integrating ed tech also requires some practical know-how. “Teachers need to be comfortable and confident with the tools they ask students to use,” says Harris.

Professional development for new technologies is crucial, as are supportive IT teams, tech providers with generous onboarding programs and technology integration specialists. Harris also points to initiatives like YES: Youth and Educators Succeeding, a nonprofit organization that prepares students to act as resident experts and classroom IT support.

KEEP READING:  What is the continued importance of professional development in K–12 education?

But as educational technology is rolled out and integrated, it’s important to keep academic goals in sight. “We should never stop focusing on how to best understand and help the learner to achieve those learning objectives,” says Harris.

That should continue to be the case as the technology timeline unfolds, something Harris has witnessed firsthand during her four decades in the field. “It’s been an incredible thing to watch and to participate in,” she notes. “The great majority of teachers are extremely eager to learn and to do anything that will help their students learn better.”

  • Open access
  • Published: 11 April 2018

The role of pedagogical tools in active learning: a case for sense-making

  • Milo Koretsky   ORCID: orcid.org/0000-0002-6887-4527 1 ,
  • Jessie Keeler 1 ,
  • John Ivanovitch 2 &
  • Ying Cao 1  

International Journal of STEM Education, volume 5, Article number: 18 (2018)

Background

Evidence from the research literature indicates that both audience response systems (ARS) and guided inquiry worksheets (GIW) can lead to greater student engagement, learning, and equity in the STEM classroom. We compare the use of these two tools in large enrollment STEM courses delivered in different contexts, one in biology and one in engineering. Typically, the research literature contains studies that compare student performance for a group where the given active learning tool is used to a control group where it is not used. While such studies are valuable, they do not necessarily provide thick descriptions that allow instructors to understand how to effectively use the tool in their instructional practice. Investigations on the intended student thinking processes using these tools are largely missing. In the present article, we fill this gap by foregrounding the intended student thinking and sense-making processes of such active learning tools by comparing their enactment in two large-enrollment courses in different contexts.

Results

The instructors studied utilized each of the active learning tools differently. In the biology course, ARS questions were used mainly to “check in” with students and assess if they were correctly interpreting and understanding worksheet questions. The engineering course presented ARS questions that afforded students the opportunity to apply learned concepts to new scenarios towards improving students’ conceptual understanding. In the biology course, the GIWs were primarily used in stand-alone activities, and most of the information necessary for students to answer the questions was contained within the worksheet in a context that aligned with a disciplinary model. In the engineering course, the instructor intended for students to reference their lecture notes and rely on their conceptual knowledge of fundamental principles from the previous ARS class session in order to successfully answer the GIW questions. However, while their specific implementation structures and practices differed, both instructors used these tools to build towards the same basic disciplinary thinking and sense-making processes of conceptual reasoning, quantitative reasoning, and metacognitive thinking.

Conclusions

This study led to four specific recommendations for post-secondary instructors seeking to integrate active learning tools into STEM courses.

Introduction

Our program recently interviewed faculty candidates for an open position. During the interview process, each candidate was asked to conduct a 20-min teaching demonstration. One candidate, a tenured associate professor from a large, public research university, had regularly taught core courses. He enthusiastically stated that he had incorporated active learning into his courses and asked to use clickers as part of the demonstration. In the first 15 min, the candidate delivered a transmission-oriented PowerPoint presentation on heat transfer. This lecture portion was followed with a multiple-choice clicker question. In the question, the instructor provided a short word problem related to the material and an equation that he had just presented. He asked the audience to select which variable in the equation was the unknown among a list of variables that appeared in the equation. All the information needed to answer the question was provided in the question stem, and it could clearly be answered simply by variable identification, independently of understanding the material that had been presented earlier. More insidiously, this question reinforced an undesirable schooling practice of many students—searching a source to find an appropriate equation and variable matching. When asked his objective for incorporating clickers into his course, the candidate stated, “I just want to make sure my students are awake.”

Motivated by the overwhelming evidence that demonstrates the effectiveness of active learning over traditional lecture in science, technology, engineering, and mathematics (STEM) courses (e.g., Freeman et al. 2014 ; Hake 1998 ; Prince 2004 ), many instructors are seeking to transform their classroom practice to incorporate active learning (Borrego et al. 2010 ; Felder and Brent 2010 ). However, as illustrated by the vignette above, these instructional practices can be taken up in a range of ways, and the instructor’s conception of learning is critical. We believe that the faculty member above chose to employ clicker technology in a way that made sense to him and that less productive enactments of active learning can be logical interpretations of research studies that predominantly focus on the effectiveness of a practice relative to its absence. In this qualitative, comparative case study, we investigate the ways experienced instructors choreograph such activity in their courses to produce learning and thereby seek to provide a complementary lens for instructors to productively implement active learning in their course.

We call the clicker applied in the vignette above an active learning tool. Tools are used in instruction to place students in an environment where they interact in intentional ways with the content and with other students and the instructor to promote learning. Tools can be technology-oriented like the clicker technology above or pedagogically oriented like the guided inquiry worksheets we describe below and often combine aspects of both orientations. Researchers who study the efficacy of these tools typically compare student performance for a group where the given tool is used to a control group where it is not used. Such research focuses on the tool’s effect (what learning gains does it produce?) and how to use the tool (what do instructors need to learn about to use it?). In many cases, incorporation of tools provides evidence of increased learning outcomes. However, this avenue of research can implicitly lead to undesired translations to practice based solely on considerations of procedure about how to use the tool, as illustrated in the vignette above. Investigations on the intended student thinking processes (not performance gains) using these tools are largely missing. In the present article, we fill this gap by foregrounding the intended thinking and sense-making processes of such technological and pedagogical tools.

We compare the use of active learning tools in two STEM courses delivered in different contexts (one in biology and the other in engineering). Both courses use the same two tools: audience response systems (ARS) and guided inquiry worksheets (GIW). They are both taught by instructors experienced with active learning pedagogies and recognized as high-quality and innovative instructors by their peers and students. We are interested in how implementation of these tools varied between courses and in identifying threads common to both. We focus on the intended student sense-making and thinking processes as the instructors integrate the tools into their courses. By sense-making, we follow Campbell et al. ( 2016 ) to mean that learners are “working on and with ideas—both students’ ideas (including experiences, language, and ways of knowing) and authoritative ideas in texts and other materials—in ways that help generate meaningful connections” (p. 19). Our goal is not to compare learning gains in these two courses in order to claim one instructor’s implementation strategy works better than the other. Rather, through analysis of the similarities and differences in the course design and practices, we seek to provide a lens into how active learning comes to “life,” and to provide instructors productive ways to think about how they can best integrate active learning into their classroom learning environment.

We ask the following research questions. In two large-enrollment undergraduate STEM courses in different disciplinary contexts:

What types of thinking and sense-making processes do instructors intend to elicit from students during their use of ARS questions? During their use of GIW questions? What are the similarities and differences between the intended uses of these tools in the two courses studied?

In what ways do the intended sense-making processes that are elicited through the use of the ARS and GIW tools align with the instructors’ broader perspectives and beliefs about the instructional system for their courses?

To situate this study, we first provide an overview of the research on ARS and GIW tools. We then describe the thinking and sense-making processes on which we will focus to understand the ways that the instructors in this study use the tools in concert and how they integrate them to achieve outcomes of instruction.

Audience response systems as tools

ARS, such as clickers, have been used increasingly in post-secondary STEM classrooms to allow instructors to shift large classes from a transmission-centered lecture mode into active learning environments (Freeman et al. 2014 ; Hake 1998 ; Prince 2004 ). Typically, the instructor provides the class a multiple-choice conceptual question, and each student in the class responds by selecting an answer on a device. In some cases, students are also asked to provide written explanations justifying their answer selection (Koretsky et al. 2016 ). Aggregate responses are available for the instructor to display to the class in real time. Often, students are asked to discuss answers in small groups, in a whole class discussion, or both (Nicol and Boyle 2003 ).
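
The mechanics of that loop (collect one choice per student, then show the aggregate) are simple enough to sketch in a few lines of Python. The response list below is invented and all device input/output is omitted; this is only a schematic of the tally-and-display step, not any particular ARS product.

```python
from collections import Counter

# Hypothetical submissions, one choice per student.
responses = ["A", "C", "B", "C", "C", "A", "D", "C"]

def show_distribution(responses):
    """Print the class-wide answer distribution an instructor might project."""
    counts = Counter(responses)
    total = len(responses)
    for choice in sorted(counts):
        bar = "#" * counts[choice]
        print(f"{choice}: {bar:<8} {counts[choice] / total:.0%}")

show_distribution(responses)
```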

ARS tools elicit live, anonymous responses from each individual student, allowing students in the class to answer new questions in a safe manner, free from judgment of peers and the instructor (Lantz 2010 ). In addition, real-time availability of the answer distribution can provide immediate and frequent feedback and allows for adaptable instruction. Based on student responses, instructors can modify class discussion and activity to meet learning needs that are more representative of the entire class rather than just a few vocal students. However, instructors also have concerns about incorporating ARS in classes, including fear about covering less content, less control in the student-centered classroom, and the time and effort needed to learn the technology and develop good questions (Caldwell 2007 ; Duncan 2005 ; MacArthur and Jones 2008 ).

The research literature on ARS use has focused broadly on both student engagement and student learning. Synthesis of individual research studies has shifted from more descriptive review papers (Caldwell 2007 ; Duncan 2005 ; Fies and Marshall 2006 ; Kay and LeSage 2009 ; MacArthur and Jones 2008 ) to more systematic meta-analyses (Castillo-Manzano et al. 2016 ; Chien et al. 2016 ; Hunsu et al. 2016 ; Nelson et al. 2012 ) that use common metrics and statistical methods to relate the characteristics and findings of a set of studies that are selected from explicit criteria (Glass et al. 1981 ). In general, researchers report ARS tools promote student engagement by improved attendance, higher engagement in class, and greater interest and self-efficacy (Caldwell 2007 ; Kay and LeSage 2009 ; Hunsu et al. 2016 ) and also suggest that anonymity increases engagement (Boscardin and Penuel 2012 ; Lantz 2010 ).

Research on student learning with ARS tools often takes an interventionist approach, comparing classes or sections where instructors use the ARS to those that only lecture (Chien et al. 2016 ; Castillo-Manzano et al. 2016 ) or, occasionally, contrasting ARS technology with the same in-class questions delivered without using ARS technology, such as by raising hands, response cards, or paddles (Chien et al. 2016 ; Elicker and McConnell 2011 ; Mayer et al. 2009 ). Learning gains are often measured from instructor-developed assessments, such as in-class exams (Caldwell 2007 ), but more robust psychometric instruments such as concept inventories have also been used (Hake 1998 ). Results generally, but not always, show improved outcomes (Hunsu et al. 2016 ; Boscardin and Penuel 2012 ). These reports also acknowledge that the relationship between ARS use and learning is complex (Castillo-Manzano et al. 2016 ). Many factors have been suggested to influence it, such as the depth of the instructor’s learning objectives (Hunsu et al. 2016 ), testing effects (Chien et al. 2016 ; Lantz 2010 ; Mayer et al. 2009 ), and the extent of cognitive processing (Beatty et al. 2006 ; Blasco-Arcas et al. 2013 ; Mayer et al. 2009 ; Lantz 2010 ) and social interactions (Blasco-Arcas et al. 2013 ; Chien et al. 2016 ; Penuel et al. 2006 ).

In summary, there is a large and growing body of literature that has examined the use of ARS tools in STEM courses. These studies suggest that they are effective in eliciting student engagement and learning, especially in large classes.

Guided inquiry worksheets as tools

GIW are material tools that guide students through inquiry learning during class. In general, inquiry learning seeks to go beyond content coverage and engage students in the practices of doing science or engineering, e.g., investigating a situation, constructing and revising a model, iteratively solving a problem, or evaluating a solution (National Research Council 1996 ; de Jong and Van Joolingen 1998 ). However, inquiry can be challenging for students since it requires a set of science process skills (e.g., posing questions, planning investigations, analyzing and interpreting data, providing explanations, and making predictions) in addition to content knowledge (National Research Council 2011 ; Zacharia et al. 2015 ). In guided inquiry, instructional scaffolds provide support to help students effectively engage in scientific practices around inquiry (Keselman 2003 ; de Jong 2006 ). There have been several pedagogies that embody inquiry learning which range from less guided approaches like problem-based learning (PBL) to more guided approaches like process-oriented guided inquiry learning (POGIL, Eberlein et al. 2008 ).

Guided inquiry learning activities are pedagogically grounded and guide students through specific preconceived phases of inquiry (Pedaste et al. 2015 ). For example, both POGIL (Bailey et al. 2012 ) and peer-led team learning (PLTL, Lewis and Lewis 2005 , 2008 ; Lewis 2011 ) are designed to guide students through a three-phase learning cycle (Abraham and Renner 1986 ): (i) the exploration phase where students search for patterns and meaning in data/models; (ii) the invention phase to align thinking around an integrating concept; and (iii) the application phase to extend the concept to new situations. Similarly, pedagogically grounded inquiry-based learning activities (IBLAs, Laws et al. 1999 ; Prince et al. 2016 ) contain three phases intended to produce a cognitive conflict that elicits students to confront core conceptual ideas: (i) the prediction phase where students make predictions about counter-intuitive situations; (ii) the observation phase where they observe an experiment or conduct a simulation; and (iii) the reflection phase which consists of writing about the differences and connecting to theory.

GIW are commonly used as tools to provide carefully crafted key questions that guide students through the conceived phases of inquiry during class (Douglas and Chiu 2009 ; Eberlein et al. 2008 ; Farrell et al. 1999 ; Lewis and Lewis 2008 ). Lewis ( 2011 ) describes GIW as typically containing from six to 12 questions that vary between a conceptual and procedural nature. Questions often progress in complexity (Bailey et al. 2012 ; Hanson and Wolfskill 2000 ). First, they might ask students to explore a concept, thereby activating their prior knowledge. Then, they ask students to interact with models and develop relationships, and finally prompt students to apply the learned concepts to new situations, thereby generalizing their knowledge and understanding. When inquiry is centered on observations of a phenomenon, GIW provide a tool for students to write down both their initial predictions and their observations, thereby producing a written record that they must reconcile (Prince et al. 2016 ).
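
One way to picture this question structure is as data. The Python sketch below (hypothetical; the example topic and prompts are invented) models a worksheet whose questions are tagged by learning-cycle phase and by conceptual or procedural kind, mirroring the progression just described.

```python
from dataclasses import dataclass, field

@dataclass
class GIWQuestion:
    phase: str    # "exploration", "invention", or "application"
    prompt: str
    kind: str     # "conceptual" or "procedural"

@dataclass
class Worksheet:
    topic: str
    questions: list = field(default_factory=list)

    def by_phase(self, phase):
        """Return the questions belonging to one phase of the cycle."""
        return [q for q in self.questions if q.phase == phase]

ws = Worksheet("gas laws", [
    GIWQuestion("exploration", "What pattern links pressure and volume in the data?", "conceptual"),
    GIWQuestion("invention", "Express the relationship as an equation.", "conceptual"),
    GIWQuestion("application", "Predict the volume at twice the pressure.", "procedural"),
])
print([q.prompt for q in ws.by_phase("application")])
```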

Similar to findings on ARS tools, the research literature indicates that guided inquiry pedagogies promote engagement (Abraham and Renner 1986 ; Bailey et al. 2012 ), learning (Abraham and Renner 1986 ; Lewis 2011 ; Prince et al. 2016 ; Wilson et al. 2010 ), and equity (Lewis and Lewis 2008 ; Lewis 2011 ; Wilson et al. 2010 ) in the STEM classroom.

Thinking and sense-making processes

Our study situates the intersection of pedagogical strategies and content delivery in the intended thinking and sense-making processes of students as they engage in active learning tasks. We take a constructivist perspective of learning (National Research Council 2000 ; Wheatley 1991 ) that views new knowledge as resulting from students’ restructuring of existing knowledge in response to new experiences and active sense-making. This restructuring process is effectively carried out through interactions with other students in groups (Chi and Wylie 2014 ; Cobb 1994 ). From this perspective, a key aspect of instruction then becomes to create and orchestrate these experiences.

Educators can design and implement learning activities in ways to cultivate productive thinking and sense-making processes while delivering course content. As emphasized in STEM 2026 , a vision for innovation in STEM education , “[a]lthough the correct or well-reasoned answer or solution remains important, STEM 2026 envisions focus on the process of getting to the answer, as this is critical for developing and measuring student understanding.” (Tanenbaum 2016 , p. 33).

Conceptual reasoning

In this study, conceptual reasoning refers to the reasoning processes where individuals and groups draw on foundational disciplinary concepts and apply them in new situations (National Research Council 2000 , 2013 ). Elements of conceptual reasoning include (but are not limited to) identifying appropriate concepts when analyzing a new problem or situation, understanding those concepts and their relationship to the context, and applying the concepts to solve problems or explain phenomena (Russ and Odden 2017 ; Zimmerman 2000 ). Facility with concepts and principles has been identified as a feature of thinking that distinguishes disciplinary experts from novices (National Research Council 2000 ).

Researchers have suggested several changes from traditional instructional design that better align with developing students’ conceptual reasoning (e.g., Chari et al. 2017 ; National Research Council 2000 , 2013 ). First, instruction should shift to more in-depth analysis of fewer topics that allows focus and articulation of key, cross-cutting concepts (National Research Council 2013 ). In doing so, the curriculum must provide a sufficient number of cases to allow students to work with the key concepts in several varied contexts within a discipline (National Research Council 2000 ). Second, classroom activities should provide students with opportunities to practice conceptual reasoning on a regular basis. Instructors can prompt this practice by asking students questions which require conceptual reasoning. They should also hold students accountable for such reasoning by participating in discussion, modeling thinking, and steering students away from rote procedural operations towards conceptual reasoning (Chari et al. 2017 ).

Quantitative reasoning

Quantitative reasoning addresses the analysis and interpretation of numerical data and the application of quantitative tools to solve problems (Grawe 2016), as well as mathematical sense-making, the process of seeking coherence between the structure of the mathematical formalism and the relations in the real world (Dreyfus et al. 2017; Kuo et al. 2013). Quantitative reasoning has been recognized as a key learning outcome for twenty-first century college graduates (Association of American Colleges and Universities 2005). Quantitative reasoning at the college level includes processes such as translating between verbal, graphical, numeric, and symbolic representations; interpreting measured data or mathematical models; and using mathematical methods to numerically solve problems (Engelbrecht et al. 2012; Mathematical Association of America 1994; Zimmerman 2000). The word "reasoning" suggests the synthesis of quantitative concepts into a greater whole (Mathematical Association of America 1994) and emphasizes the process of performing the analysis rather than merely the product that results from it. In upper-division college courses, quantitative reasoning tends to be even more sophisticated, as "the connections between formalism, intuitive conceptual schema, and the physical world become more compound (nested) and indirect" (Dreyfus et al. 2017, p. 020141-1).

Quantitative reasoning reflects the incorporation of mathematical knowledge and skills into disciplinary contexts (Mathematical Association of America 1994). In science and engineering, quantitative reasoning can include making sense of measured data and connecting them to physical phenomena (Bogen and Woodward 1988) or developing mathematical models that predict and generalize (Lehrer 2009; Lesh and Doerr 2003). Thus, the use of mathematics extends beyond the procedures and algorithms that students sometimes take as synonymous with the field. Researchers claim that mathematical sense-making is possible and productive for learning and problem solving in university science and engineering courses (e.g., Dreyfus et al. 2017; Engelbrecht et al. 2012).

In addition, conceptual reasoning and quantitative reasoning are intertwined in disciplinary practice and should be cultivated in tandem. Researchers have identified several ways that conceptual reasoning aids science and engineering problem solving, including conceptualizing a problem to find the equations needed to solve it mathematically, checking and interpreting the result after the equations are solved, and guiding the process of working through to the solution (Kuo et al. 2013). Zimmerman (2000) points out that domain-specific concepts, i.e., "thinking within the discipline," and domain-general quantification skills (e.g., evaluating experimental evidence) "bootstrap" each other and, when exercised together, lead to deeper understanding and richer disciplinary knowledge and skills.

In a culture that often focuses on and rewards procedural proficiency, it can be challenging to engage students in quantitative reasoning (Engelbrecht et al. 2012). Active learning strategies can help (Grawe 2016). Such strategies include emphasizing accuracy relative to precision, asking students to create visual representations of data or translate between representations, asking students to communicate about their quantitative work, and setting assignments in an explicit, real-world context (Bean 2016; Grawe 2016; MacKay 2016).

Metacognitive thinking

Metacognition often refers to "thinking about thinking" or "second-order thinking," the action and ability to reflect on one's own thinking (Schoenfeld 1987). Research evidence suggests that metacognition develops gradually and is as dependent on knowledge as on experience (National Research Council 2000). Ford and Yore (2012) argued that critical thinking, metacognition, and reflection converge into metacognitive thinking and can improve the overall level of one's thinking.

Vos and De Graaff (2004) claimed that active learning tasks, such as working on projects in engineering courses, do not just require metacognitive knowledge and skills but also encourage the development of the learners' metacognitive thinking. Based on several decades of research literature, Lin (2001) concluded that there are two basic approaches to developing students' metacognitive skills: strategy training and the design of supportive learning environments.

Veenman (2012) pointed out three principles for the successful instruction of metacognitive thinking: (1) metacognitive instruction should be embedded in the context of the task; (2) learners should be informed about the benefit of applying metacognitive skills; and (3) instruction and training should be repeated over time rather than being a one-time directive. When designing STEM curricula in an integrated way, a central issue becomes determining which aspects of metacognition to teach and the context in which they should be taught (Dori et al. 2017).

Methods

To answer our research questions, we use a comparative case study of two STEM courses, each implementing both an ARS and GIW. Data for this study were collected within a larger institutional change initiative whose goal is to improve instruction in large-enrollment STEM courses across disciplinary departments through implementation of evidence-based instructional practices (Koretsky et al. 2015). Through multiple data sources, we seek to provide a thick description (Geertz 1994) of how and why instructors use these active learning tools in large classes.

Case selection

We selected courses based on the regular use of ARS and GIW tools as part of classroom instruction. In addition, we sought courses in different disciplinary contexts and department cultures, since such instructors would be more likely to show variation in tool use. Based on these criteria, we identified courses in biology, in engineering, and in a third STEM discipline. Based on instructor willingness to participate, we ultimately investigated the biology and engineering courses in this study. Both are taught in the same public university, have large student enrollments, and are required courses for students majoring in the respective disciplines. Both instructors had experience using these tools for several terms prior to our investigation and were identified by peers and students as excellent educators.

The biology course, Advanced Anatomy and Physiology, is the third course in an upper-division sequence required for biology majors. Prerequisites for the course included completion of the preceding courses in the year-long sequence and concurrent or previous enrollment in the affiliated lab section. The enrollment was 162 students in the term studied. The engineering course, Material Balances, is the first course of a required three-course sequence for sophomores majoring in chemical, biological, and environmental engineering. The enrollment was 307 students in the term studied.

Data collection and analysis

Data sources include a series of interviews with each instructor, classroom observations and instructional artifacts, and records of student responses to ARS questions.

Instructor interview protocol

We conducted four semi-structured interviews with each instructor over the span of three academic years. The first (year 1) and fourth (year 3) interviews were reflective interviews that probed the instructors’ general teaching practices and instructional beliefs. They included questions about department and college duties, interactions with other faculty regarding teaching and learning, perceptions about successful students, and responses to the larger change initiative. The second and third interviews (year 2) focused specifically on the ARS and GIW questions, respectively, applied to a specific delivery of the instructor’s course and are described in more detail below. All interviews were audio-recorded.

The interviews on the ARS and GIW questions sought to elicit the instructors' understandings of the questions they assigned and their rationale for assigning them, the reasons and purposes for using each tool (ARS or GIW), and how they used these tools in the greater context of their courses. To investigate the intended types of student thinking processes for each active learning tool, we asked the instructors to write out their solutions to the questions following a think-aloud protocol (Ericsson 2006). For these interviews, each instructor was interviewed by a researcher with deep domain knowledge in the course content under study. The researcher provided the instructors with hard copies of selected ARS questions (Interview 2) and GIW questions (Interview 3) and asked them to write their responses on them, which were then collected for analysis. To gain insight into each instructor's perception of the questions when positioned as a learner, we began each interview with the following prompt: "I want you to imagine you're looking at these questions from a student's perspective, and I want you to talk through how you would answer each one." The think-aloud portion was followed by a reflective portion. After the instructors worked through all the questions, they were directed through the question set a second time with the prompt: "What was your rationale when assigning this question?"

Selection of ARS questions for think-aloud interview

For the interview with the biology instructor, we decided it was feasible, given the brevity of the biology ARS questions, to have the instructor work through all 31 ARS questions delivered in the Friday POGIL sessions in an hour-long interview. The engineering ARS questions required more time than the biology questions, so it was not feasible to expect that instructor to work through all 31 ARS questions in the same time frame. We therefore chose a subset of diverse questions that were representative of the whole set. Criteria for choosing questions included the difficulty of the question (determined by the percent correct scores from the students), the topic of each question (based on the topics outlined in the course syllabus), question clarity, and how the percent correct changed when peer instruction was used. Following these criteria, we selected 15 of the 31 questions to present to the engineering instructor.

Selection of GIW questions for think-aloud interview

During interviews, we asked the instructors to engage with GIW questions from selected weekly worksheets. We selected a single guided inquiry worksheet for the think-aloud interview with the biology instructor. The worksheet focused on parameters of vascular function. We chose this worksheet based on the instructor's input that it was representative of the type of worksheets students would encounter during a guided inquiry session for her course. For the engineering course, the instructor described two approaches to worksheet development, one more conceptually based and one more computational. We chose two worksheets, one for each approach. The first worksheet was selected because it was the first worksheet that applied the concept of a material balance (conservation of mass); this concept was then incorporated into almost all of the subsequent guided inquiry worksheets in the class. The second worksheet was a later-term worksheet that asked students to use Excel to perform calculations and then answer questions involving quantitative and qualitative answers. We first asked the instructors to answer the GIW questions as if they were students encountering the worksheet for the first time and subsequently asked them to explain their rationale for placing each question on the worksheet.
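For readers outside the discipline, the material balance concept that anchors the first engineering worksheet can be stated compactly. The following is a standard textbook formulation of conservation of mass for a defined system, offered here for orientation; it is not taken from the course materials:

accumulation = input − output + generation − consumption

For a non-reactive system at steady state, the generation, consumption, and accumulation terms vanish and the balance reduces to input = output.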

Interview analyses

We transcribed all the interviews verbatim and analyzed interviewee responses using emergent coding. For the think-aloud interviews, this process was used to identify the general intended thinking processes that occurred as instructors worked and talked through ARS and GIW questions from the perspective of the students in their course. We examined each ARS or GIW question response the instructors gave and identified individual steps. We assigned a code to each step describing what the instructor was doing at that step. Then, we recognized that sets of individual steps from different questions belonged to more general categories representing the broader types of thinking and sense-making processes described in our theoretical framework (i.e., conceptual reasoning, quantitative reasoning, and metacognitive thinking). In such cases, we grouped them into a more general code category. For example, codes such as "use graphical information to qualitatively explain situation," "relate information to physical representation," and "identify relationships between variables" were grouped in the more general code category "conceptual reasoning." Similarly, codes such as "develop equations to describe phenomena," "rearrange equation to identify relationships between variables," and "perform calculations" were grouped together in the more general code category "quantitative reasoning." Identifying categories from specific thinking processes in this way led to a reliable coding process. By grouping, we were able to develop a general set of codes that connect the data to our theoretical framework. We could then compare thinking processes (i) between the courses representing different disciplines and (ii) between different pedagogical tools within each course. Table 1 provides the final set of code categories for the intended thinking processes during the think-aloud interviews, including a description of each code and a sample interview quote.
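To make the grouping step concrete, a minimal sketch is shown below. The mapping contains only the example codes quoted above (the full codebook was larger), and the function name and data layout are illustrative rather than part of the study's actual analysis pipeline.

```python
# Illustrative sketch of the grouping step: fine-grained emergent codes are
# mapped to the broader categories of the theoretical framework. Only the
# codes quoted in the text are shown here.
code_categories = {
    "use graphical information to qualitatively explain situation": "conceptual reasoning",
    "relate information to physical representation": "conceptual reasoning",
    "identify relationships between variables": "conceptual reasoning",
    "develop equations to describe phenomena": "quantitative reasoning",
    "rearrange equation to identify relationships between variables": "quantitative reasoning",
    "perform calculations": "quantitative reasoning",
}

def categorize(step_codes):
    """Collapse the step-level codes for one question into framework categories."""
    return sorted({code_categories[c] for c in step_codes if c in code_categories})

# For example, one question whose solution involved two coded steps:
print(categorize(["perform calculations", "identify relationships between variables"]))
# -> ['conceptual reasoning', 'quantitative reasoning']
```

Tallying the categories assigned to each question in this way yields question-level percentages of the kind reported in Table 3.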

For the reflective interviews, we sought to relate the ways the instructors used the ARS and GIW tools to their priorities about what students would learn and their conceptions about how students learn and the role of teaching in learning (Thompson 1992). Code categories were determined as follows. One researcher coded the interview transcripts initially and developed a set of emergent categories based on elements of the instructional system that the instructors mentioned. The research team then met and reconciled the categories. This process resulted in the following subset of categories relevant to this study: instructional scaffolding, constructivism, social interactions, formative assessment, summative assessment, and sense-making.

For both think-aloud and reflective interviews, a single researcher with appropriate expertise performed the coding. A subset of the interview transcripts (20–25%) was then coded by a second researcher to ensure reliability (Cohen’s kappa = 0.80 [think-aloud], 0.84 [reflective]).
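For reference, Cohen's kappa corrects the raw agreement between two coders for the agreement expected by chance:

κ = (p_o − p_e) / (1 − p_e)

where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance given each coder's marginal code frequencies. Values in the range obtained here (0.80–0.84) are conventionally interpreted as strong inter-rater agreement.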

Other measures

We used several other data sources to triangulate our interview data analysis and interpretation. Both classes were observed several times using the Teaching Dimensions Observation Protocol (TDOP; Hora et al. 2013). While a report of TDOP codes for each course is beyond the scope of this article, the observation process allowed researchers to become familiar with the course structure and the context in which the ARS and GIW active learning tools were used. The observations were supplemented with instructional artifacts, including the course syllabus, all the ARS questions and GIW activities, and the week's lecture notes for the analyzed GIW activities. Student responses to ARS questions were collected through the web-based platform that each instructor used to assign questions and receive student responses. The response data were used to verify our interpretation of the instructors' different intents in their use of ARS questions.

Context of tool use

In this section, we describe how each instructor situates the use of ARS and GIW tools within the activity structure of their courses. This description is based on analysis of the course observations and course artifacts and triangulated with the reflective interviews.

Table 2 provides a summary of the differences in the context and use of the active learning tools between the biology and engineering courses. We unpack these enactments when we present the results.

Biology course

The biology course met three times per week, with more traditional lecture periods on Monday and Wednesday and an active learning period of POGIL guided inquiry sessions on Friday. ARS questions were used in most class periods, whereas the GIW tool was used only in the Friday POGIL class periods. Over the 10-week term, students answered a total of 98 ARS questions, 31 of them during the POGIL GIW sessions, and completed nine GIWs in total. On average, about 110 of the 162 enrolled students attended the GIW sessions.

Figure 1 shows an example of a biology ARS question after it was delivered in class. ARS questions were presented to students as PowerPoint slides, and the students answered using clickers. The instructor typically displayed the distribution of students' choices on the screen shortly thereafter and briefly commented on the correct answer. Students answered between two and five questions per class period. Our think-aloud interview with the biology instructor focused on the ARS questions delivered in the Friday POGIL sessions.

Fig. 1 Example of a biology ARS question as it was delivered in class

The instructor used a GIW tool to facilitate student activity during the guided inquiry sessions on Fridays. A typical Friday session (50 min) consisted of an introduction (2 min), student group work using the GIW and facilitated by undergraduate learning assistants (15–25 min), ARS questions and group discussion (5–10 min), continued student group work (10–15 min), and wrap-up (2–5 min). During these sessions, the classroom was divided into seven territories with a learning assistant assigned to each territory. The instructor assigned students to groups of three, which were maintained for the duration of the term. Students collaboratively worked on a GIW, answering an average of 19 worksheet questions during each session. The GIWs engaged students in interpreting models of an underlying concept and provided data that students used to answer questions. Some worksheets also contained "extension questions" that students engaged with outside of class, or in class if they finished the worksheets ahead of other groups. Extension questions typically consisted of two types, classified by the instructor as (i) big picture questions, open-ended questions to stimulate discussion and examine larger concepts, and (ii) clinical correlation questions, which situate concepts in clinical applications.

Figure 2 shows the first part of the GIW tool that we used for the think-aloud interview. The worksheet contained three different models; each model was followed by two to nine questions that students answered based on information from the models. The worksheet ended with six "extension questions" that the students answered outside of class time. Although these extension questions required more critical thinking than the previous questions, they still typically referenced the models contained in the worksheet.

Fig. 2 The first part of the biology inquiry-based worksheet used in the think-aloud study

Engineering course

The engineering course had two 50-min lectures, on Monday and Friday, attended by all students. ARS questions were delivered during the 50-min Wednesday sessions, and GIWs were used during the 50-min Thursday sessions. Over the 10-week term, the students answered a total of 31 ARS questions and completed nine GIWs. An average of 285 out of 307 students responded to questions during the ARS sessions, and almost all students attended the GIW sessions.

In the Wednesday sessions, students were divided into two sections of approximately 150 students each, and ARS questions were delivered via the AIChE Concept Warehouse (Koretsky et al. 2014) by the primary instructor to each section separately. Figure 3 shows an example of an engineering ARS question. Students responded to the questions using their laptops, smartphones, or tablets, and the answers were stored in a database. Along with their multiple-choice answer, students were asked to provide a written justification of their answer as well as a confidence score between one and five to indicate how sure they were of their answer. For about half of the ARS questions (15 questions), the engineering instructor used an adaptation of the peer instruction (PI) pedagogy (Mazur 1997) in which students first answered a posted question individually, then discussed their answers with their neighbors, and then re-answered the same question (including written justification and confidence). For the 16 non-PI questions, the instructor displayed the responses and discussed the question with the whole class after the students answered individually.

Fig. 3 Example of an engineering ARS question as it was delivered in class

On Thursdays, students attended smaller “studio” sessions of approximately 30–36 students where they completed GIW activities. Most studios were facilitated by a graduate teaching assistant (GTA) who would sometimes briefly (1–2 min) introduce the topic. For most of the studio period, students spent their time actively working. Some worksheets involved an introduction section that was to be completed individually, while others solely contained group work. Studio groups contained three or four students. Group members were kept the same for 5 weeks, after which students were assigned to new groups by the GTA for the remainder of the 10-week term. GTAs were coached to respond to student questions with subsequent guiding questions, as opposed to providing students with direct answers to the worksheet questions. Students were given 50 min to work on the worksheets which were collected at the end of each studio session.

An example of part of a GIW used in the engineering think-aloud interview is shown in Fig. 4. Most worksheets involve a main problem with a problem statement containing the relevant information students need to solve it. The problem is then broken down into steps that students complete to help them reach an answer. These steps are modeled after problem-solving processes introduced in lecture and may include questions that require students to read and interpret the problem statement, to draw and label a diagram representing what is happening in the problem, to formulate and solve appropriate equations, and to relate the problem statement to real-world engineering scenarios.

Fig. 4 The first part of one of the engineering inquiry-based worksheets used in the think-aloud study

Results

In the following sections, we answer our two research questions with evidence from our data analysis.

Answer to RQ1: What are the instructors’ intended thinking processes during use of ARS and GIW tools? What are the similarities and differences between instructors?

Our answer to Research Question 1 is based on analysis of the think-aloud interviews for the ARS and GIW questions, triangulated with student responses to ARS questions, researchers' in-class observations, and the instructors' reflective interviews.

Thinking processes elicited from the ARS and GIW questions

Table 3 shows the percentage of questions identified for each category of thinking process during the think-aloud interviews (see Table 1 for category definitions). Results from the ARS questions are shown in the left column for each course, with results from the GIW questions in the right column. The majority of ARS questions in the biology course focused on immediate recall (75%). While the biology instructor expected students to be able to recognize a concept in 25% of the questions, there was no evidence that the ARS questions were intended to evoke further conceptual reasoning. In contrast, the engineering instructor rarely sought to elicit immediate recall (7%), but rather provided students with experiences in which they needed to select information from questions (47%) and recognize concepts (73%) as prompts for conceptual reasoning (73%). In addition, the majority of the questions also included elements of quantitative reasoning (60%).

During her think-aloud interview, the biology instructor showed a wide range of intended scientific thinking processes in responding to the GIW questions, including conceptual reasoning (42%), quantitative reasoning (42%), and metacognitive thinking (32%). These processes were not treated in isolation; rather, the instructor integrated them around thinking about models of the fluid dynamics of vascular function. The worksheet tended to be "stand alone," with information usually found in the question (95%) rather than requiring students to recall information from lecture (21%). The biology worksheet is self-contained in the sense that most of the information students need to complete the worksheet is provided via the models. In contrast, during his think-aloud interview, the engineering instructor intended students to spend the majority of their time engaged in quantitative reasoning (60%), with only a small amount of time in conceptual reasoning (5%). There was also less intended metacognitive thinking in these activities (10%) than in the biology course. In addition, the engineering instructor intended students to reference previous knowledge and information presented in lecture (85%) to a much larger degree than the biology instructor.

While the uses of the ARS and GIW tools in each course are distinct, inspection of Table 3 shows that in either course, by the time students completed that week's active learning activity, they were intended to engage significantly with a key disciplinary topic through two of the aspects of thinking and sense-making: conceptual reasoning and quantitative reasoning. Thus, the "coverage" of the topic extends beyond declarative content and patterns of problem solving; rather, it emphasizes productive ways to think and reason in the discipline. The biology instructor had more explicit intended metacognitive thinking than the engineering instructor (32% vs. 10%). However, for all 31 ARS questions, the engineering instructor had students rate their confidence (see Fig. 3), so while he did not allude to metacognitive thinking as much during the think-aloud interviews, some of this type of thinking was built into the technology tool.

Student ARS performance

We next present student performance data from the ARS questions for each course. These data show differences in the types of questions asked and in their implementation, and they reflect the differences in intended thinking processes discussed previously. Figure 5 shows the percentage of students who answered correctly for each question when the ARS questions were delivered in class. Results from the 31 ARS questions delivered during Friday POGIL sessions in the biology course are shown chronologically with red diamonds (labeled BIO). Students generally performed well, averaging 89.3% correct (solid red line) with a standard deviation of 10.5%. In the engineering course, students' initial responses are shown with solid dark blue circles (ENGR pre-PI). They averaged 58.5% correct (solid dark blue line) on these questions with a standard deviation of 20.2%. For the 15 questions where the engineering instructor used the PI pedagogy, the post-PI results are shown by powder blue circles; the average correct was 80.0% (powder blue line) with a standard deviation of 14.8%. For the questions where PI was used, scores increased by an average of 18.0%, showing the benefit of peer discussion, although there were two questions where scores significantly decreased (Q12 and Q21).

Fig. 5 ARS question student performance data
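For readers who collect similar response logs, the summary statistics above (per-question percent correct, means, standard deviations, and pre/post-PI changes) are straightforward to compute. The sketch below is illustrative only; the record layout and field names are hypothetical and do not reflect the export format of either platform used in these courses.

```python
# Hypothetical sketch: summarizing ARS response logs into per-question
# statistics (percent correct, mean, SD, and pre/post-PI change).
from statistics import mean, stdev

# Each record: (question_id, phase, is_correct), where phase is "pre" for a
# first (or only) response and "post" for a re-answer after peer instruction.
responses = [
    ("Q1", "pre", True), ("Q1", "pre", False), ("Q1", "post", True),
    ("Q1", "post", True), ("Q2", "pre", True), ("Q2", "pre", True),
]

def percent_correct(records, qid, phase):
    answers = [correct for q, p, correct in records if q == qid and p == phase]
    return 100 * sum(answers) / len(answers)

questions = sorted({q for q, _, _ in responses})
pre_scores = [percent_correct(responses, q, "pre") for q in questions]
print(f"pre-PI mean = {mean(pre_scores):.1f}%, SD = {stdev(pre_scores):.1f}")

# Pre/post change for the questions that had a peer-instruction round
pi_questions = [q for q in questions if any(r[0] == q and r[1] == "post" for r in responses)]
gains = [percent_correct(responses, q, "post") - percent_correct(responses, q, "pre")
         for q in pi_questions]
print(f"mean change after PI = {mean(gains):.1f} percentage points")
```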

The nature of the ARS questions is clearly different in the two courses: the engineering questions were more difficult and took more class time, and when the peer instruction pedagogy was used, they were asked twice. These differences reflect both the course contexts (in engineering, a weekly class period was dedicated to the ARS questions) and the intended thinking processes identified in the think-aloud interviews with the instructors (Table 3). We next explore the reflective interviews with the instructors to see how these uses align with their conceptions of how the tools fit into their courses to produce learning.

Instructor perceptions of the ARS and GIW questions

In this section, we present excerpts and analysis from the four reflective interviews with each instructor: the year 1 general interview (labeled g-pre) and the year 3 general interview (g-post) focused on more general questions about instructional practices and beliefs, while the year 2 post-think-aloud interviews (post think-aloud ARS or GIW) specifically addressed the instructors' intent in using these tools.

ARS questions

The reflective interviews corroborate the identified differences between the courses in the ways the ARS questions engaged students. In the biology course, ARS questions were used mainly to assess whether students were correctly interpreting and understanding worksheet questions or to ask students to recall material recently introduced in the worksheet. Each of the biology ARS questions was asked only once. When questioned about her rationale for designing ARS questions, the biology instructor acknowledged that they can be a helpful tool in large classes.

Biology Instructor (post think-aloud ARS): ...and so you know the value of clicker questions in rooms of greater than 100 people is… in a really big class, I can’t see their sheets, and so I don’t know what they’re thinking. And it’s really useful to check in with them in that way. [italic added for emphasis]

She alluded to the role of ARS questions as “concept checking,” and pointed out that she regularly uses them to “check in” with students to ensure they are engaged and following along in class.

The engineering course presented ARS questions that afforded students the opportunity to apply learned concepts to new scenarios, with the aim of improving students' conceptual understanding. When the engineering instructor was asked how he wants students to be engaged while solving ARS questions, he explained that he wanted to push students beyond procedural problem solving:

Engineering Instructor (post think-aloud ARS): I guess trying to get them to start to create that knowledge structure in their head. That there are certain conventions and there are certain cues that are gonna help them bend those problems into, you know, help them find a solution... Trying to provide cues that are similar to things they’re gonna see in other parts of the class, homework and exams and so forth, to get them to hone in on those specific concepts and then in some cases manipulate or examine at a level that they’re not gonna get just by plugging and chugging into those equations. [italic added for emphasis]

This line of thinking tied into comments in the year 1 general reflective interview where the engineering instructor referred several times to the ability to conceptually reason by identifying a concept and applying it to a new situation, such as in the following excerpt:

Engineering instructor (g-pre): If you can understand the fundamentals, the fundamental concepts that are governing a process then, you know, if you start to change all these other things, if you can remember that kind of core concept then that goes a long way to carrying these through being able to reason through a solution, where if I just know, I just have some equation memorized...that’s gonna fall apart, you know, when you get to a situation where that equation doesn’t exactly apply. [italic added for emphasis]

This emphasis on reasoning or sense-making from foundational concepts is consistent with the engineering instructor’s choice to devote one class period a week to activity around ARS questions.

The interpretation of the instructors' different uses of the ARS questions is consistent with the analysis of intended thinking processes from the think-aloud interviews (Table 3) and with the student percent-correct response data (Fig. 5). The biology instructor used the ARS as a periodic check-in with students, whereas the engineering instructor used ARS questions more extensively as an opportunity for students to develop their understanding and "create that knowledge structure" they needed for adaptive transfer and problem solving.

GIW questions

As we found with the ARS questions, the instructors also utilized guided inquiry worksheets differently. During the interview, we asked the biology instructor why she recommended the specific guided inquiry worksheet shown in Fig. 2 for the think-aloud interview. She explained that she considered it exemplary of the typical GIW for her course.

Biology Instructor (post think-aloud GIW): And so what I really like about POGIL that … this worksheet adheres to is you can get everything you need from this, you know, strictly from the models, and your brain, and thinking about things. And maybe if you don't know what these vessels are, yeah, you could look them up, but you probably do, you know, based on where my students are at. And so, like that’s what I like. This is very much a standalone.

Here, she expresses how the inquiry-based worksheets in the biology course are designed to be self-contained; there is less emphasis on connecting to information presented in previous lectures or elsewhere and more emphasis on sense-making or, as she says, "thinking about things."

In the engineering course, the GIWs were used in studio sections, where the larger class was broken into smaller sections of around 30 students to work on the worksheets. During the interview, the engineering instructor described the relationship between the GIWs and other aspects of the course, especially how closely they are tied to information introduced in lecture.

Engineering Instructor (post think-aloud GIW): I view studio as a really scaffolded and supported place for students to have their first experience applying the principles from lecture. So you kind of get all this information and not a lot of chances to engage it in lecture and before you get a blank problem statement from the homework assignment and are left with a blank page [you get a chance] to walk through the steps or the concepts that are gonna have to be applied as we move forward to homework and exams. Having it be a place where they’ve got classmates they can bounce ideas [off of]…

Here, he clarifies that he views the guided inquiry worksheets as a useful step for students between being shown ways to solve problems in lecture and applying those problem-solving methods on homework assignments and exams. In the year 3 reflective interview, the engineering instructor further elaborated on how he envisions the GIW tool fitting within the instructional processes:

Engineering instructor (g-post): In these studios where students basically come in and they’re working on a worksheet on a problem that’s related to things that we’ve covered in class, it’s pretty scaffolded, but there’s some open ended components, but they’re kind of working together in groups of three kind of independently with support from a T.A. during that time.

As this excerpt indicates, when considering active learning tools, it is useful to consider other important aspects of the instructional system, as we do next.

Answer to RQ2: In what ways do the intended sense-making processes from the ARS and GIW tools align with the instructors’ broader perspectives and beliefs about the instructional system for their courses?

Our answer to Research Question 2 is based on analysis of the year 1 general interview (g-pre) and year 3 general interview (g-post).

Beliefs about ARS and GIW as active learning tools in instructional systems

In this section, we explore more broadly what the instructors conceive as elements of the instructional system and how the ARS and GIW active learning tools fit within those broader elements. Here, the conceptions of the two instructors generally align.

Table 4 shows category codes for elements of the instructional system and examples of the corresponding instructor beliefs that emerged from analyzing the two reflective interviews with each instructor. The table also provides exemplar excerpts from each instructor. Both instructors expressed that the tools provided Instructional Scaffolding that helped guide students' learning. The excerpt from the biology instructor indicates how she sees scaffolding from the GIW tool as necessary to provide students "a structure to follow," while the engineering instructor describes the role of each tool in progressively providing students "a learning unit" that created a "cohesive, weekly routine."

Both instructors used language that was consistent with a constructivist perspective of learning; the biology instructor often referred to students “constructing their own knowledge”, and several times the engineering instructor indicated that he aimed to help students “develop knowledge structures.” In addition, both instructors valued the role of Social Interactions in constructing understanding, expecting students “to interact, not just with the content but with each other to make meaning of the content” (biology instructor) and “talking with their group and grappling with the material” (engineering instructor). Both instructors allude to how instructional tools can provide the impetus for students to interact with one another in sense-making processes.

Both instructors also suggested that the data from ARS questions were useful for Formative Assessment, to "see what they're [i.e., the students are] thinking and where the misconception might be" (biology instructor). The ARS tool allows an instructor to have his or her "finger on the pulse of the class" (engineering instructor) and gives students the "opportunity to assess their own learning" (engineering instructor). The engineering instructor also tied this aspect of the instructional system to Social Interactions, stating that ARS questions give "them an opportunity to communicate what they've learned to their peers."

Both instructors also connected the need to develop disciplinary sense-making to their experiences with Summative Assessments. As the biology instructor states, "I started crafting exams and assessments that were, you know, more about how could students predict, could students look at a set of data and then make inferences from it, and I came to realize that they couldn't really do that." This realization motivated her to implement POGIL with GIW in her course to help students develop these skills. Similarly, the engineering instructor recalled a time when he received pushback from students for an exam that was perceived as "unfair." He explained his "rationale" to them as follows: "if you understood this, the concept from this application then...you know, I was looking to see if you could transfer it and use it over here." Importantly, both instructors hold students accountable for higher-level disciplinary thinking processes when they test students, thus aligning the sense-making processes they seek to develop through the active learning tools with the questions on their exams.

In summary, both instructors value and seek to cultivate sense-making processes. The biology instructor describes these processes as "thinking like a biologist," which includes defending answer choices and prompting students to be reflective. The engineering instructor expects students to "figure out what the answer is" by "being able to take a lot of information, break it up into the parts and map it to, again, those concepts that are kind of fundamental, and then use that information to come to a [numerical] solution."

Discussion

In this study, we investigated how two instructors used ARS and GIW tools, in order to identify and compare the ways they intended for students to "get to the answer." The data show that while the same active learning tools were used in both courses, the ways in which students were asked to engage in problem solving and sense-making varied. In the biology course, ARS questions were used primarily to "check in" with students, to see whether they were correctly interpreting the worksheet content (e.g., graphs and models), or to ask students to recall recently introduced material. In the engineering course, ARS questions asked students to apply the concepts covered in lecture to new scenarios, with the aim of improving students' conceptual understanding. These uses reflect the activity structure of each course. The biology course centered on briefly using clickers in almost every class to support instruction (lecture or POGIL). In the engineering course, one class day each week (25% of instruction time) was devoted to ARS questions, and the instructor asked students to engage in deeper ways by providing written justifications and confidence ratings.

In the biology course, the GIWs were primarily used as stand-alone activities, and most of the information necessary for students to answer the questions was contained within the worksheet. Typically, the information was presented in a context that aligned with a disciplinary model. In the engineering course, the instructor intended for students to reference their lecture notes and rely on their conceptual knowledge of fundamental principles from the previous ARS class session in order to successfully answer the GIW questions. The biology instructor used the worksheets as an opportunity for students' integrated development of conceptual reasoning, quantitative reasoning, and metacognitive thinking. The engineering instructor, on the other hand, focused primarily on cultivating aspects of quantitative reasoning for problem solving.

In our analysis, we position ARS and GIW as tools that are utilized within instructional systems to produce learning. We have shown that the specific intent of the biology instructor when she uses these tools is very different from that of the engineering instructor. However, common threads emerged that can serve as ways to consider instruction with active learning tools. Both instructors use these tools to build towards the same basic disciplinary thinking and sense-making processes of conceptual reasoning, quantitative reasoning, and metacognitive thinking. Conceptual reasoning processes identified in the think-aloud interviews included intending students to use graphical information to qualitatively explain a situation, relate information to a physical representation, and identify relationships between variables. Quantitative reasoning processes included developing equations to describe phenomena and manipulating equations to reveal the relationships between variables. Metacognitive thinking included considering alternative solution strategies and reflecting on the reasonableness of an answer value in relation to a physical system.

Both instructors also clearly intended students to interweave these thinking and sense-making processes. The engineering course design was more sequential: students engaged in conceptual reasoning processes during ARS sessions and were then expected to recall those foundational concepts as they were prompted to reason quantitatively with the GIW activity in studio the following day. The biology course design used "POGIL Fridays" to provide a more integrated active learning experience in which conceptual reasoning, quantitative reasoning, and metacognitive thinking were more interlocked.

Both instructors clearly alluded to the value of disciplinary thinking processes in their general reflective interviews. However, they did not explicitly identify conceptual reasoning, quantitative reasoning, or metacognitive thinking, nor did they appear to make these connections in the post-think-aloud interviews when they were asked more specifically about the intent of the ARS and GIW tools. Thus, the incorporation of conceptual reasoning, quantitative reasoning, and metacognitive thinking appears to be tacit, even for these experienced and highly regarded instructors. We suggest that more direct and explicit emphasis on the ways active learning tools elicit these types of thinking would be beneficial as instructors design activities and integrate them into courses.

Causes of difference in tool use

Hypothetically, we might ask, “If we put one of these instructors in the other’s classroom, how similar would their use of the ARS and GIW tools appear in that different context?” There are several legitimate avenues of inquiry that could be pursued to answer this question. We draw from the extant literature to identify these avenues and assert that considering this complex question from several perspectives is productive.

First, we might consider the instructors' beliefs and knowledge. As the set of responses in Table 4 indicates, both instructors demonstrated learner-centered beliefs oriented towards learning facilitation, as opposed to teacher-centered beliefs oriented towards knowledge transmission (Prosser and Trigwell 1993). While they shared common orientations, there could be more subtle differences in their beliefs. Speer (2008) suggests that a more fine-grained characterization of an instructor's "collection of beliefs" is needed to connect beliefs to specific instructional design choices. Such characterization could provide information about why differences between these instructors' uses of the tools emerged. Alternatively, the instructors' designs may be influenced by their knowledge about an educational innovation. Rogers (2003) identifies three types of knowledge needed to implement an innovative tool: awareness knowledge (that the tool exists), how-to knowledge (how to use the tool), and principles knowledge (what purpose the tool serves). In their interviews, both instructors clearly demonstrated awareness and principles knowledge, but differences in how-to knowledge may have led to different enactment strategies. How-to knowledge can be tied to normative use in the department and in the discipline (Norton et al. 2005). For example, there may be more (or different) access to POGIL workshops in biology than in engineering. Further investigation of the degree to which detailed instructor beliefs and how-to knowledge influence the choice and use of active learning tools is warranted.

Second, we might consider the different disciplinary contexts of the courses, i.e., biology vs. engineering. The National Research Council (2012) reports that while there are many common pedagogical approaches across science and engineering, there are also "important differences that reflect differences in their parent disciplines and their histories of development" (p. 3). Schwab (1964) argues that each discipline has a unique "structure" leading to particular ways of thinking; specifically, he distinguishes thinking associated with "disciplined knowledge (in biology) over the know-how in the solving of practical problems" (p. 267) in engineering. Ford and Forman (2006) extend this framing to disciplinary practices. Each discipline has a unique set of fundamental and central practices that need to be articulated and incorporated into classroom activity. These sociocultural practices provide access to discipline-specific ways of thinking, knowing, and justifying. They state that a central goal of education is that students develop "a grasp of practice," which includes both disciplined knowledge and "know-how" (p. 27). This line of inquiry suggests that investigations are needed to elucidate the productive ways active learning tools can support disciplinary practices and the ways those uses can differ among STEM disciplines or among courses within a discipline.

Third, we might consider how the active learning tools were situated within each course's schedule and institutional resources. The biology class met only in single large-class sections and used undergraduate learning assistants to support POGIL Fridays. The engineering course had dedicated smaller studio sections supported by graduate teaching assistants. These different contexts are largely determined by how each department organized classes and support for teaching, and they would likely take sustained effort for an individual instructor to change. Since each course relied upon pedagogically trained student instructors to engage student groups during use of the GIW tools, one of the instructor's roles was to orchestrate and manage an instructional team. In large courses, productive ways to engage the instructional team can become an integral part of incorporating active learning tools (Seymour 2005). In addition, each student instructor brings his or her own knowledge and beliefs about learning to this work (Gardner and Jones 2011). Coordinated activity within the department, college, or university, such as programmatic professional development of student instructors, can become a valuable resource. Research is needed to better understand the ways these greater organizational structures enable or constrain the use of active learning tools.

Limitations

This study examined the practices of only two instructors within the same institution; it would be useful to verify the findings with a larger sample of instructors and courses that fit the criteria of the study. The study also focused on the intent of the instructors, through think-aloud and reflective interviews triangulated with other data sources. In both courses, students regularly worked in small groups, and it remains to be seen to what degree students were taking up the thinking and sense-making processes of conceptual reasoning, quantitative reasoning, and metacognitive thinking. This take-up clearly depends on the social aspects of learning, involving interactions among the students themselves and with the instructor. It would be useful to examine what types of moves by students promote or short-circuit these sense-making processes within a group, as well as to identify productive ways for an instructor to intervene to facilitate thinking. Finally, while the same three general intended sense-making processes were identified in both the biology and engineering courses, their manifestation undoubtedly depends on the nature of the specific practices of each discipline. Articulation of the specific ways that practicing biologists and engineers engage in disciplinary sense-making could inform more productive uses of these active learning tools.

Recommendations

This study has led to the following recommendations for post-secondary instructors seeking to integrate active learning tools into STEM courses:

Recommendation 1: When transitioning to active learning, it is common to think about instructional choices in terms of "pedagogies" like POGIL or Peer Instruction, or active learning "technologies" like clickers. We encourage instructors instead to think about these choices in terms of pedagogically based and technology-based active learning "tools." A tool should serve definite educational purposes that are defined prior to use. As with any type of tool, procedural competence is necessary. However, as illustrated in this study, these tools can be used in several ways, and their use can become more sophisticated with time.

Recommendation 2: A tool-based orientation should go beyond procedures and prescriptions for delivery. Active learning tools can cultivate disciplinary thinking and sense-making processes that include conceptual reasoning, quantitative reasoning, and metacognitive thinking. Importantly, these processes can bootstrap one another towards deeper understanding (Veenman 2012; Zimmerman 2000). Thus, in designing activities for students, instructors should consider how to progressively integrate the different types of sense-making processes so that they support one another in doing disciplinary work and building disciplinary understanding. Integration can be achieved either through a sequence of activities, as the engineering instructor did (i.e., conceptual reasoning with ARS followed by quantitative reasoning with GIW), or within a single activity, as the biology instructor did (i.e., conceptual reasoning, quantitative reasoning, and metacognitive thinking with POGIL).

Recommendation 3: Active learning tool use needs to account for course structure and context, with deliberate choices that support learning goals. The biology instructor enacted POGIL Fridays within a standard MWF lecture schedule. The engineering instructor had a split class on Wednesdays to support use of the ARS tool for conceptual understanding and smaller studio sessions on Thursdays for guided inquiry. Instructors should think about their course structures and, if possible, work with administrators to adapt them for better alignment with the tools that support instructional goals.

Recommendation 4: In using active learning tools to promote disciplinary sense-making, instructors of all levels of experience should take a reflective and iterative view of their instructional practice. For example, both instructors studied here were acknowledged by students and peers as excellent, a characterization supported by the interview data. Even so, they could reflect on ways to shift their activity with active learning tools to better align with learning goals. The biology instructor might push students towards conceptual reasoning in her delivery of ARS questions, and the engineering instructor might modify his GIWs to place more emphasis on conceptual reasoning and metacognitive thinking. Rather than viewing such changes in instruction as inherently a criticism of teaching prowess, instructors should view ongoing adjustment as a characteristic of masterful practice.


Acknowledgements

The authors are grateful to RMC Research who conducted, audio-recorded, and transcribed the year 3 general reflective interviews, to Jana Bouwma-Gearhart who provided comments on an early version of the manuscript, and to the two instructors who kindly agreed to allow us insight into their teaching practice.

This work was conducted with support from the National Science Foundation under grant DUE 1347817. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.

Availability of data and materials

The datasets generated and/or analyzed during the current study are not publicly available because this is still an active project and a public release would violate the terms of our IRB approval. Some parts of the data set are available from the corresponding author on reasonable request.

Author information

Authors and Affiliations

School of Chemical, Biological, and Environmental Engineering, Oregon State University, Corvallis, OR, 97331, USA

Milo Koretsky, Jessie Keeler & Ying Cao

College of Education, Oregon State University, Corvallis, OR, 97331, USA

John Ivanovitch

Author notes

John Ivanovitch is deceased. This paper is dedicated to his memory.


Contributions

All authors made substantial contributions to the article and participated in the drafting of the article. All living authors read and approved the final manuscript.

Corresponding author

Correspondence to Milo Koretsky.

Ethics declarations

Ethics approval and consent to participate

This study has been approved by the Institutional Review Board (IRB) at the authors’ institute (study # 6158).

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article

Cite this article

Koretsky, M., Keeler, J., Ivanovitch, J. et al. The role of pedagogical tools in active learning: a case for sense-making. IJ STEM Ed 5, 18 (2018). https://doi.org/10.1186/s40594-018-0116-5


Received : 31 December 2017

Accepted : 22 March 2018

Published : 11 April 2018


Keywords

  • Active learning
  • Audience response systems
  • Guided inquiry
  • Sense-making


Microsoft Education tools that supported learning in 2021

December 16, 2021

By Microsoft Education Team


Perhaps unlike ever before, learning in 2021 occurred in a wide variety of classroom settings. But no matter where it took place—remotely, in person, or in hybrid environments—schools and educators showed great flexibility as they worked to meet students’ academic and social-emotional needs. As we reflect on and celebrate the year, we’re inspired by how classrooms evolved and how educators used Microsoft Education tools, technology, and resources to adapt and support student learning.

Here are just a few of the new tools for learning and resources that supported classrooms this year:

Building community and skills among educators:

  • Professional development events: The year kicked off with BettFest 2021, a free, three-day virtual event loaded with professional development opportunities and energizing keynotes from industry leaders.
  • Teacher trainings: Educators enjoyed an array of free Microsoft Store trainings to help them kick off back-to-school season even more prepared.
  • Educator networks: Educators and schools continued to use our networking communities and programs to learn, innovate, and share valuable information amid the pandemic. We celebrated them when we welcomed the new 2021-2022 class of Showcase Schools and MIE Experts.
  • Office Hours for teachers: Virtual Office Hours, designed by and for educators, provides teachers with quick two- to three-minute videos that explore how they can leverage Microsoft resources and tools to build a better classroom for tomorrow.

Supporting students, inclusivity, and social-emotional learning

Anti-racism education:

Fostering inclusion in the next generation of students who will grow up to be citizens, advocates, and leaders of tomorrow starts in the classroom. The Anti-racism journey for educators with students learning path helped educators spark important conversations in their classrooms and create communities where all students are better seen and heard.

  • Helping students with reading: The introduction of Reading Progress, a Microsoft Teams tool that assists students in building their reading skills and fluency, helped build more confident learners.
  • Social-emotional learning: Acknowledging and sharing feelings takes practice. The introduction of the Feelings Monster helped it become a daily routine for many students, leading the way to more supportive classrooms and expressive learners.
  • Student career pathing: Learning continues well beyond graduation, so we launched Career Coach—a tool helping students navigate their career journeys and apply new and existing skills in the workforce.

Using new devices and data-driven intelligence to optimize learning

  • New devices for the emerging digital age: We introduced new Windows 11 SE devices, built with tools that optimize student learning experiences and equip students with tech for success in education’s emerging digital age.
  • Data-driven intelligence about students: Education Insights in Teams—now automatically added to every class team—offered educators more ways to understand the individualized needs of each student through data and trend insights that inform personalized learning experiences.

There’s much to be excited about heading into 2022 as classrooms continue evolving and we all explore new ways to support student learning. As we meet new challenges, we will do it together. Microsoft Education has you covered with plenty of tools for learning in any environment.

Sign up for Microsoft Education’s monthly newsletter to stay in the loop with what’s new in the new year and beyond.

Related stories


New Teams features to help enhance student engagement

Dynamic learning environments engage students. They provide experiences and tools that help students connect with content, see relevance in their learning, and have multiple means of expressing their ideas.


Improving educator well-being and engagement using Teams

Now that the school year has begun, it’s important to have the right tools to help you create a quality learning environment for students and educators alike. With these new Microsoft Teams features, you can better understand educator well-being, connect with parents and guardians to help manage student learning progress, and use various tools through our integration with partner app Kami.


Enhancing students’ social, emotional, and academic growth

In these challenging times, closing learning gaps and helping students catch up are top of mind for educators working to accelerate learning. But where does student well-being fit in with this? There’s a strong case to be made for focusing more heavily on assessing and addressing student emotional wellness.


Clin J Am Soc Nephrol. 2016 Mar 7; 11(3)

Educational Tools: Thinking Outside the Box

Majka Woods

* Office of Educational Development, University of Texas Medical Branch, Galveston, Texas; and

Mark E. Rosenberg

† Office of Medical Education, University of Minnesota Medical School, Minneapolis, Minnesota

The understanding, study, and use of educational tools and their application to the education of adults in professional fields are increasingly important. In this review, we have compiled a description of educational tools on the basis of the teaching and learning setting: the classroom, simulation center, hospital or clinic, and independent learning space. When available, examples of tools used in nephrology are provided. We emphasize that time should be taken to consider the goals of the educational activity and the type of learners, and to choose the most appropriate tools for meeting those goals. Constant reassessment of tools is important to discover innovation and reforms that improve teaching and learning.

Introduction

Educational tools, especially those related to technology, are populating the market faster than ever before ( 1 , 2 ). The transition to active learning approaches, with the learner more engaged in the process rather than passively taking in information, necessitates a variety of tools to help ensure success. As with most educational initiatives, time should be taken to consider the goals of the activity, the type of learners, and the tools needed to meet the goals. Constant reassessment of tools is important to discover innovation and reforms that improve teaching and learning ( 3 ).

Available resources and tools range from technology-driven solutions to strategies for creating more interactive and engaging learning opportunities. With the digital divide diminishing and a new era of technical literacy well under way, learners can be brought together in new and exciting ways ( 4 ). Many new tools focus on learning outside the confines of a traditional classroom, making them excellent resources for adult learners in professional settings ( 5 ). Increasingly, we find that technology-based solutions for learning allow individuals to determine their own educational path and achievement of competencies through innovative new platforms ( 6 ). Discussion boards, blogs, interactive exercises, simulations, visualization software, and multimedia software all encourage learners to manage their own learning.

Richard Mayer ( 7 ) developed the cognitive theory of multimedia learning in part to help us better understand how people learn from both words and pictures. Mayer ( 7 ) suggests that there are many factors involved in appropriate multimedia instruction and that there are three primary instructional goals in any learning environment that need to be addressed regardless of the type of teaching: ( 1 ) extraneous processing by the learner must be minimized, ( 2 ) essential processing by the learner needs to be managed appropriately, and ( 3 ) generative processing or meaningful learning is critical to all learning situations. Mayer ( 8 ) challenges us to think critically about multimedia instruction and its appropriate and effective use. His work ( 8 , 9 ) calls out the need to consider how people learn in simulated environments and with multimedia and how that translates to application of the activity ( 9 ). These guiding principles are critical as new tools are developed, used, and assessed. Technology should not be used in a vacuum, apart from research and applied education theory and understanding. The guidelines above give a way to think about the tools in this article and to determine whether they are helpful and the right ones for the learning environment you want to establish.

In this article, we have compiled a short list of tools that can keep you thinking outside the box. The tools are organized on the basis of the setting where the teaching and learning take place, including classroom, simulation center, hospital and clinic, and independent learning. This provides both a framework for discussion and a roadmap ( Figure 1 ) for where to use the tools, with the caveat that many of them cross over from one setting to the next.

Figure 1. Roadmap of educational tools available in the different settings where medical education takes place. 3-D, three-dimensional.

Classroom

The classroom remains a common setting for medical education but one that is being transformed by new techniques and tools. Active learning is a model of instruction in which students engage with materials through a variety of methods, including reading, talking and listening, writing, problem solving, and reflecting ( 10 ). The responsibility of learning is focused on the students as opposed to the traditional passive lecture format. The flipped classroom is one large-group active learning technique and is discussed below. Another form of active learning is problem-based learning (PBL), which will not be discussed, but the reader is referred to several excellent reviews ( 11 , 12 ). PBL focuses on applying material in context and integrates both basic science and clinical knowledge for solving complex problem sets. In PBL, a multipart clinical case is presented to a small group of students, who then work through the case using critical thinking and applied knowledge skills. In this process, faculty act as facilitators (tutors) to support and guide the process.

Flipped Classroom

Flipped classroom is the popular name for a variety of pedagogies that invert the traditional roles of the in-class lecture and homework outside the class. In this technique, rote information is learned before class through a series of mini-lectures, and class time is used for interactive learning ( 13 ). More than any other recent disruption, the flipped classroom has the power to turn learning experiences into higher-order thinking exercises, resulting in more critical thinking and richer, deeper learning experiences.

When the classroom is flipped, students are forced to take greater ownership by learning the material and then coming to the session ready to apply and evaluate what they know. There is no right or wrong way to flip, and there are many models available to fashion a flipped experience ( 14 ). Preliminary research indicates that attendance, learning, and perceived value of the education all increase with the flipped model ( 15 – 18 ).

The use of instructional videos to create replicable content delivery is a method that more and more educational settings rely on for consistency, validity, efficiency, and cost effectiveness. Traditional video streaming and capture of long face-to-face lectures have shown relatively little retention and satisfaction for learners ( 19 ). However, the newer shortened, topic-specific model has proven much more positive. The Khan Academy has produced an extensive series of high-quality short scripted video tutorials around medical topics that present the information for students to learn, think about, and ultimately, apply in interactive scenarios ( 16 , 20 , 21 ). This is a disruptive innovation that early on allowed for nonclassroom intensive learning experiences driven by the student.

Audience Response System

Audience response systems (ARSs) are tools to facilitate active learning that can increase audience participation in the classroom and assess learning in real time ( 22 , 23 ). ARSs allow the teacher to gauge where the learners are as a whole, what areas they may be struggling with, and in turn, what issues need to be addressed immediately. This is especially useful when there are a variety of learners in the classroom and there is a need to have learners on the same page to discuss larger or more in–depth areas of content.

Boscardin and Penuel ( 24 ) performed a review of the literature examining the efficacy of ARSs and concluded that ARSs increase learner engagement but do not uniformly improve learning, although there were confounding factors that made interpretation of some of the articles difficult. Other studies have shown improved learning and knowledge retention with use of ARSs ( 25 , 26 ).

The technology behind ARSs has advanced significantly, with older hardware–based devices, such as classroom clickers, being replaced by software/Cloud–based audience response using mobile phones, Twitter, or web browsers (for example, www.polleverywhere.com ). Other programs, such as TodaysMeet ( www.todaysmeet.com ), provide a backchannel chat platform for teachers and learners that can be displayed in real time during the presentation to allow multilocated learners to comment or ask questions. This tool expands the classroom community and empowers learners to be more interactive.
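To make the mechanics concrete, here is a minimal sketch, in Python, of the vote-collection pattern these software-based systems share: each device submits an answer, and the instructor view is recomputed in real time. The class, question, and student identifiers are hypothetical; a real product adds networking, authentication, and a display layer.

```python
from collections import Counter

class PollQuestion:
    """One ARS-style question with a live tally of responses."""

    def __init__(self, prompt, options):
        self.prompt = prompt
        self.options = options
        self.votes = Counter()   # option -> current vote count
        self.responders = {}     # student_id -> option (last answer wins)

    def submit(self, student_id, option):
        """Record a response; resubmission replaces the student's earlier vote."""
        if option not in self.options:
            raise ValueError(f"unknown option: {option}")
        previous = self.responders.get(student_id)
        if previous is not None:
            self.votes[previous] -= 1
        self.responders[student_id] = option
        self.votes[option] += 1

    def tally(self):
        """Return per-option percentages for the instructor's live display."""
        total = sum(self.votes.values()) or 1
        return {opt: 100.0 * self.votes[opt] / total for opt in self.options}

# Usage: gauge where learners are before deciding what to reteach.
q = PollQuestion("Which diuretic acts on the thick ascending limb?",
                 ["A. Furosemide", "B. Spironolactone", "C. Acetazolamide"])
q.submit("student-01", "A. Furosemide")
q.submit("student-02", "C. Acetazolamide")
q.submit("student-02", "A. Furosemide")   # changed answer; tally stays consistent
print(q.tally())
```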

Virtual Classroom

Virtual classrooms are platforms that allow students to receive instruction from a qualified teacher in an interactive environment using a variety of communication technologies. Software, such as WebEx, GoToMeeting, GoToTraining, or Adobe Connect, can all enable virtual classrooms. These tools allow communication through webcams, microphones, and real-time chat. Virtual classrooms allow participants to see instructional material, raise their hands to comment and ask questions, and take polls and quizzes from anywhere in the world. Learners can also screencast and add to whiteboard presentations.

Software, such as Voicethread ( www.voicethread.com ) and Zaption ( www.zaption.com ), facilitates interactions in a virtual classroom setting. Voicethread is an interactive presentation software that allows the user to upload different types of media (video, PDFs, and pictures) into a single presentation format, and then, voice threads are created (asynchronously) on the presentation ( 27 ). Learners can be involved in a dialogue among themselves, with the faculty, and/or with a combination of the two. The power in this model is the ability to create cohorts of learners who interact on common questions, case scenarios, and content in an asynchronous environment.

With Zaption, content is presented with a video, and then, the teacher can drop in ways to engage with the content, turning a passive watching activity into an active learning activity. Learners not only watch the video but also, are directed to engage in a variety of real–time active problem–solving and reflection activities that correspond with the content. Zaption can be used on computers and mobile devices as well, making it an interesting partner to the flipped classroom. This gives a home for content to live before the discussions and forces learners to engage and reflect on the content and be prepared to discuss it in an applied manner.

Digital Badges

Digital badges are a recent concept designed to recognize and validate additional training or activities beyond the usual educational accomplishments and may be used to manage content criteria and assessment ( 28 ). Digital badges are an electronic certification of accomplishment and in concept, similar to physical badges used by organizations, such as the Boy Scouts of America and the military. They are digital tokens that appear as an icon or logo, and they are stored electronically in a badge repository, such as Mozilla–hosted Badge Backpack ( http://openbadges.org ), Credly ( www.credly.com ), or BadgeOS ( www.badgeos.org ).

Metadata are contained in the different layers of the badge, including badge name, issuer’s and recipient’s information, criteria for the badge, description of the badge, issue date, expiration date, and a link to evidence supporting granting of the badge ( Figure 2 ). Digital badges can be accessible from the recipient’s curriculum vitae, Dean’s letter, ePortfolio, or signature line and can be placed on social media sites, such as Facebook, Google+, and LinkedIn.

Figure 2. Anatomy of a digital badge. Schematic representation of a digital badge consisting of a badge image and various metadata about the learner’s accomplishment. Reprinted from classhack.com, with permission.
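As a rough illustration of the metadata layers described above, the sketch below assembles a badge assertion as JSON. It loosely follows the spirit of the Open Badges format; all names, dates, and URLs are invented for the example.

```python
import json
from datetime import date

# Badge definition: the reusable description of what the badge certifies.
badge_definition = {
    "name": "Fundamentals of Clinical Teaching",
    "description": "Completed a workshop series on teaching at the bedside.",
    "issuer": "Example School of Medicine, Office of Medical Education",
    "criteria": "https://meded.example.edu/badges/clinical-teaching/criteria",
    "image": "https://meded.example.edu/badges/clinical-teaching/icon.png",
}

# Assertion: the layer tying a specific recipient to the badge, with dates
# and a link to supporting evidence, as described in the text.
assertion = {
    "badge": badge_definition,
    "recipient": "learner-4821",
    "issuedOn": date(2016, 3, 7).isoformat(),
    "expires": date(2019, 3, 7).isoformat(),
    "evidence": "https://meded.example.edu/portfolio/learner-4821/workshop",
}

# The assertion is what gets stored in a badge "backpack" and displayed
# from a CV, ePortfolio, or social media profile.
print(json.dumps(assertion, indent=2))
```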

When used to their full capacity, digital badges are a means to help learners self-identify areas of strength and improvement, provide feedback on specific knowledge, skills, and attitudes, and track progress toward competence and mastery. Because the learners are responsible for collecting and presenting evidence of their progress, this tool may be useful for developing skills in lifelong learning and mastery of information. For adult learners in particular, being the owner of the learning can be very motivating and may even improve learning outcomes.

Digital badges are being used by the Khan Academy and in higher education ( 29 ). Boston University School of Medicine has a medical education badge program (BUSM+) to learn the fundamentals of teaching and learning ( 30 ).

Simulation Center

Simulation-Based Learning

Simulation-based medical education (SBME) is a common method to teach technical and nontechnical skills and test competencies ( Figure 3 ). This method is based on the premise that making and learning from mistakes are powerful ways to learn. SBME takes multiple forms and can be performed by individuals and teams ( 31 , 32 ). A number of tools exist for SBME. Mannequins are lifelike full or partial body models used for a variety of simulation exercises. Partial task trainers are anatomic models that resemble a portion of the body and are used to practice one specific skill. Virtual reality simulators, such as LapSim, are used for practicing laparoscopic procedural skills. SBME can also involve use of virtual patients and virtual hospitals or can occur in situ, a setting that is especially conducive to interprofessional team training ( 33 ). Advances in technology allow realistic and even patient-specific three-dimensional models to be constructed and used to create interactive virtual models for teaching anatomy and learning procedures, such as kidney biopsy ( Figure 4 ).

Figure 3. Simulation-based learning. A trainee performing echocardiography on a mannequin in a simulated environment while being observed by a faculty member. Reprinted from SimPORTAL, University of Minnesota and Spicy-Meatball Photography (Joe Vruno).

Figure 4. Model of kidney anatomy for simulation. This is an example of a three-dimensional model of kidney anatomy that can be constructed and used to create interactive virtual models for teaching anatomy and learning procedures, such as kidney biopsy. Patient-specific models can be created using Digital Imaging and Communications in Medicine (DICOM) standards, which can then be converted into meshes and subsequently three-dimensionally printed. Reprinted from Daniel Burke and Dr. Robert Sweet, SimPORTAL, University of Minnesota.
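The caption’s pipeline from imaging data to a printable model can be sketched in a few lines of Python. This is a simplified illustration, assuming the pydicom, scikit-image, and numpy-stl packages; the directory, threshold, and filenames are hypothetical, and real anatomic modeling requires segmentation and cleanup well beyond a single global threshold.

```python
import glob
import numpy as np
import pydicom                  # pip install pydicom
from skimage import measure    # pip install scikit-image
from stl import mesh           # pip install numpy-stl

# Stack the CT slices into a single volume, ordered along the z axis.
slices = [pydicom.dcmread(f) for f in glob.glob("kidney_ct/*.dcm")]
slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)

# Marching cubes extracts a triangle mesh at the chosen intensity threshold.
verts, faces, _normals, _values = measure.marching_cubes(volume, level=300)

# Write the triangles to an STL file suitable for a three-dimensional printer.
surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
for i, face in enumerate(faces):
    surface.vectors[i] = verts[face]
surface.save("kidney_model.stl")
```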

In 2010, McGaghie et al. ( 34 ) performed a critical review of SBME research and concluded that the work in SBME was promising but just beginning and that it required additional research and review. McGaghie et al. ( 34 ) cited 12 critical traits of SBME that still guide the ideas behind good simulation tools today, ranging from simulation as a feedback source to thinking about context in learning. Since that time, a considerable body of work has emerged on the importance and efficacy of simulation-based models of education.

Additional research in specialties, including nephrology, shows that, although initial training on simulators may improve skills, those gains decline fairly significantly over time, indicating that perhaps the strength of simulation is in ongoing training and practice, where learners at all levels return periodically to test and improve themselves ( 35 ). Okuda et al. ( 36 ) summed it up nicely by concluding that most medical students, residents, and fellows will now do a significant amount of training on simulators, but relatively little data exist on the ultimate effect of this training on patient outcomes.

Serious Games

Games used for active learning in education and training are referred to as serious games, in contrast with more conventional video games ( 37 – 39 ). Serious games can be used both in the simulation center and for independent learning, and they tend to be more learner-centered, interactive, and engaging than traditional teaching models. Serious games vary in complexity and technology requirements; in medical education, they are often used to teach technical skills, teamwork (often with an ultimate goal of improving quality and patient safety), and complex decision making, and to excite students about a medical topic ( 38 , 39 ). The development of games for medical education should define a clear goal for learner outcome, the rules for playing the game, and a method for players to track their progress toward the goal. Games should focus on the learning objectives and not on the technology.

A consensus-based framework for the assessment of medical serious games has been developed that provides 62 items in five main themes (game description, rationale, functionality, validity, and data protection) ( 37 ). The effectiveness of gaming as a learning tool needs additional study ( 40 – 42 ). Graafland et al. ( 38 ) performed a systematic review of serious games for medical education and surgical skills training and identified 25 articles describing 30 serious games. Of the games identified, none had undergone a full validation process.

In nephrology, a game-based format has been used to teach management of transplant patients ( 43 ), phase-contrast microscopy of urine ( 44 ), and important concepts in nephrology ( 45 ). A resource for those interested in more information is the Journal of Medical Internet Research Serious Games ( www.jmir.org ), which is devoted to computer/web applications that incorporate elements of gaming to solve serious medical and health problems.

Hospital or Clinic

Much of medical education takes place in hospitals and clinics, in hallways or workrooms of hospital wards, at the patient’s bedside, in the operating room, or in an outpatient clinic room. Tools are available to facilitate and enhance the learning experiences in these settings.

Point of Care Resources

Many resources are available for the learner or provider to access answers to clinical questions at the point of care. The widespread availability and use of mobile devices, including smartphones and tablets, enable the use of point of care resources ( 46 ). UpToDate, originally developed by Burton Rose in 1991, was one of the first and most used online resources designed to provide evidence–based decision support at the point of care ( www.uptodate.com ). The initial focus of UpToDate was nephrology, but it has now expanded to 22 clinical specialties, is used by health care providers all over the world, and can be accessed by both computer and mobile devices. An association between use of UpToDate and improved patient outcomes has been reported ( 47 ).

Medically related applications are widely available and can be downloaded onto smartphones, iPads, and other devices. Many of these can be used as learning tools and readily available sources of point of care information. Applications can provide differential diagnoses, treatment algorithms, drug-dosing information, antibiotic choices, and medical calculators. Nephrology applications are available, including Nephrology On-Demand Plus and Nephrology Tool by Epocrates. Standards and a framework have been proposed for assessing digital health applications but are currently not widely adopted ( 48 , 49 ).
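As a small example of the medical-calculator category, the sketch below implements the well-known Cockcroft-Gault estimate of creatinine clearance in Python. It is illustrative only; a clinical application would add unit checks and input validation.

```python
def cockcroft_gault(age_years: int, weight_kg: float,
                    serum_creatinine_mg_dl: float, female: bool) -> float:
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault equation."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl   # 0.85 correction factor for women

# Example: a 60-year-old, 70 kg woman with serum creatinine 1.2 mg/dL.
print(f"{cockcroft_gault(60, 70, 1.2, female=True):.0f} mL/min")  # -> 55 mL/min
```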

Tablets as Whiteboards

Teaching on the hospital ward or clinic is a way to increase learner engagement and excitement around clinically relevant topics. iPads can be used as a way of modernizing these clinical chalk talks. Applications are available to turn iPads into digital whiteboards for on the spot teaching and collaboration ( 50 ). For example, on BaiBoard (Lightplaces Limited; free), a cloud–based collaborative whiteboard application (currently available on iPads and Mac), a hallway presentation can be done from either a blank iPad screen (canvas) or one preloaded with graphics, pictures, PDFs, or movies. Graphics, text, drawing, PDFs, or images can be added during the presentation using a wide range of tools, including importing from Dropbox, Google Drive, or the web. The presentation can be shared on other participants’ screens or sent to a classroom screen and may include offsite individuals who can connect with an IP address to the meeting in a secure manner. Many other whiteboard applications are also available ( Table 1 ).

Table 1. Summary of online educational tools available for collaboration and presentation

Wearable Technology

Google Glass is a wearable computer consisting of a camera, processor, and small computer screen mounted to the edge of eyeglass frames that can search the internet, take photos, record video and audio, and thereby provide a first-person point of view ( 51 , 52 ). For medical education, Google Glass has been used as a tool to facilitate the teaching of surgical techniques, for recording simulated patient encounters, and for teaching anatomy ( 53 , 54 ). Although no longer available through the official Google Play Store, there are rumors that it may return as an enterprise rather than a consumer product. Work is ongoing in the use of newer technology for imaging procedures with the goal of improving surgical training. Surgeons have recorded operations using Oculus Rift, a virtual reality headset that provides a fully immersive three-dimensional experience ( 55 ). The potential application of wearable technology in health care and medical education remains intriguing.

Independent Learning

Independent learning remains a lifelong skill required of all physicians. With the explosion of medical knowledge and resources for learning, it is important for health professionals to develop their own strategies for keeping up to date with the latest developments in basic and clinical science. This section summarizes a number of tools available for independent learning.

e-Textbooks

An e-textbook is best described as a digital version of a physical textbook, often used for a specific course at an educational institution. e-Textbooks are downloadable to a laptop, tablet, or smartphone and include search capability, personal highlighting, annotation, linking, and possible inclusion of audio/video or interactive media. An example of a new e-textbook in nephrology is the recently published Chronic Renal Disease, which is available both in print and as an e-textbook ( 56 ). The book has a companion website with downloadable figures, as either PDFs or PowerPoint slides, along with an interactive test bank ( 57 ).

Social Media and e-Learning

The general term e-learning is most often used to describe learning conducted through electronic media, most commonly outside of the classroom and delivered online using the Internet. e-Learning can be synchronous, occurring in real time with all learners participating and interacting at the same time, or asynchronous, with individual learners proceeding at their own pace. Social media is the term used to describe online resources that can be used for information access and exchange and collaboration. Social media are increasingly being used for e-learning and may be better suited to keep up with the rapid pace of new knowledge generation and dissemination and the hunger for anytime, anywhere learning. A unique feature of social media is that it provides opportunities for rapid interaction among a community of users. Tools commonly used for social media include websites and applications dedicated to forum or discussion boards ( e.g. , Usenet), blogging ( e.g. , WordPress, TypePad, and Blogger), microblogging ( e.g. , Twitter), social networking ( e.g. , Facebook, LinkedIn, Doximity, and Google+), social bookmarking ( e.g. , Flickr), social curation ( e.g. , Reddit and Pinterest), and wikis ( e.g. , Wikipedia).

Twitter is a microblogging platform widely used across medical education. Users can receive and send short (140 characters) messages called tweets. Twitter also provides opportunities for chats that are real–time synchronous learning opportunities to share ideas and communicate about specific topics, often with a facilitator moderating the discussion ( e.g. , Twitter journal clubs).

The term free open access meducation (medical education; FOAM; #FOAMed is the Twitter hash tag) has been used to describe the use of social media characterized by a crowd-sourced collection of online content that can be in multiple formats, including blogs, podcasts, tweets, Google hangouts, discussions, videos, text documents, photographs, Facebook groups, and other content ( 58 ). FOAM has been most fully adapted by emergency medicine and critical care. The content is meant to be constantly evolving, collaborative, interactive, and personalized to meet the requirements of the individual user. Content can be easily accessed from any device that can connect to the Internet ( 59 ).

Use of social media in medical education has been associated with improved knowledge, attitudes, and skills, but there are very few randomized controlled trials ( 59 ). A number of barriers and challenges to the use of social media include technical issues, variable learner participation, the distracted state of users, and privacy concerns ( 59 ). Other issues include the randomness of discussions, the quality of material, and the absence of peer review before information is posted. Learners and educators also often differ by generation and in how fully they embrace social media, differences that likely owe more to generation than to educational expertise. Kind et al. ( 60 ) have provided some useful tips for the use of social media by medical educators.

Nephrology Websites and Blogs

There are various websites that compile nephrology educational material ( Table 2 ). These often contain lectures by experts in the field, images, videos, links to clinical practice guidelines, news reporting, a place for sharing ideas and networking, and a repository for tools that can be useful for both nephrology education and clinical practice.

Table 2. Selected social media resources for nephrology education

AJKD , American Journal of Kidney Diseases ; KDIGO, Kidney Disease Improving Global Outcomes; CJASN , Clinical Journal of the American Society of Nephrology ; NephJC, Nephrology Journal Club.

Not all emerging tools and educational innovations will be successfully adopted ( 61 ). The hype cycle provides a framework for thinking about the life of such innovations. It is an analytical model developed by the information technology research and advisory firm Gartner Incorporated ( www.gartner.com ) ( Figure 5 ) to conceptualize where an emerging technology sits in the developmental continuum. It describes five phases representing the development, adoption, and spread of specific technologies.

Figure 5. The hype cycle. Hype cycle analytical model developed by the information technology research and advisory firm Gartner Incorporated. The five key phases of a technology’s lifecycle are illustrated. The concept has been applied to emerging technology in education (details in the text). From Jeremykemp at English Wikipedia.
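For readers who want the five phases by name, the snippet below lists the standard Gartner labels and shows how a team might tag tools in a local inventory. The example entries and their phase assignments are hypothetical judgments, not Gartner’s.

```python
# The five Gartner hype cycle phases, in order of a technology's lifecycle.
HYPE_CYCLE_PHASES = (
    "Technology Trigger",
    "Peak of Inflated Expectations",
    "Trough of Disillusionment",
    "Slope of Enlightenment",
    "Plateau of Productivity",
)

# A hypothetical local inventory tagging tools by perceived phase,
# in the spirit of a community-sourced hype cycle tool.
inventory = {
    "wearable cameras for teaching": "Trough of Disillusionment",
    "audience response systems": "Plateau of Productivity",
}

for tool, phase in inventory.items():
    print(f"{tool}: phase {HYPE_CYCLE_PHASES.index(phase) + 1} ({phase})")
```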

This model has been applied to education with the Hype Cycle for Education tool developed at the University of Minnesota, which gathers community-sourced information about technology in higher education ( 62 ). The tool allows users to learn about and monitor new academic technologies, share experiences with these technologies, and innovate by adopting new technologies or applying new techniques. The ultimate goal is to improve strategic decision making regarding financial and time investments in technology.

The understanding, study, and use of educational tools, their place in the hype cycle, and their application to education of adults in professional fields are increasingly important. With a millennial generation (born 1985–2000) who are more technologically literate than any previous generation, more attuned to the need for active learning, and more experienced in collaborative work environments, we are being forced to reassess not only what we teach but also how we provide the information. Having a solid understanding of the landscape of educational technology is as important as understanding and using any specific tool. With a market that is continuously creating new and improved tools, keeping a focus on the tools’ intended use is imperative.

Of the many tools described in this review, some are more likely than others to survive the hype cycle and become a part of the educators’ toolbox, whereas others are likely to be replaced by other educational innovations. It is difficult to predict which tools will stay and which will go, but many are already showing signs that they are not just passing fads. Active learning is becoming firmly established, even to the point of medical schools designing and building classrooms suitable for this form of pedagogy. This will involve greater use of the flipped and virtual classrooms, including the use of short videos and ARSs. Simulation and the technology and tools related to it are also becoming established, especially with greater health system focus on quality improvement and patient safety. There will also continue to be expansion in the use of social media to facilitate communication between learners and educators and as a way to stay updated on the latest developments. The hype cycle exists on several levels: within departments, within schools, within institutions, and within the larger world of education as a whole. Each of these spheres influences the innovations that will stay and to what degree they will be embedded in the culture of medical education. Those that show sustained staying power tend to permeate more of the spheres of influence, tend to generate a need for deeper understanding that leads to further scholarship and research, and are applicable in multiple settings.

Adult learners will continue to expect new tools with which to learn, and they will also be among the creators, improvers, and, in some cases, rejecters of those tools. Both faculty and learners must have some capacity to try new tools and teaching techniques. This requires a culture of permission to try and not always succeed with technology; time to work with the technology; an understanding of which technology fits best within the pedagogy and discipline; and financial capital and, occasionally, space to store and work with the technology.

In a review such as this, there are many excellent tools, social media sites, and other resources that have inadvertently been left out or that have been developed during the writing and publishing of the article. We suggest using #mededtool on Twitter to share your favorite tool, comment on exclusions, or discuss new tools. Hashtag (#) is a way to categorize messages and is useful to mark a topic in a tweet that can then be searched.

There is no single solution for moving forward with innovative teaching and learning techniques. Technology comes and goes, learners come in all types (as do faculty), and time and money are almost always at odds with day-to-day operations. However, trying out even one new technique, technology, or training experience can be a step in a new direction, one that may improve patient care even more dramatically over the next decade.

Disclosures

M.E.R. is the coeditor of the textbook Chronic Renal Disease that is discussed in the article.

Acknowledgments

The content is the responsibility of the authors and does not reflect or represent the views of the American Society of Nephrology, for which M.E.R. serves as a councilor.

Published online ahead of print. Publication date available at www.cjasn.org.

More Teachers Are Using AI-Detection Tools. Here’s Why That Might Be a Problem


As ChatGPT and similar technologies have gained prominence in middle and high school classrooms, so, too, have AI-detection tools. The majority of teachers have used an AI-detection program to assess whether a student’s work was completed with the assistance of generative AI, according to a new survey of educators by the Center for Democracy & Technology. And students are increasingly getting disciplined for using generative AI.

But while detection software can help overwhelmed teachers feel like they are staying one step ahead of their students, there is a catch: AI detection tools are imperfect, said Victor Lee, an associate professor of learning sciences and technology design and STEM education at the Stanford Graduate School of Education.

“They are fallible, you can work around them,” he said. “And there is a serious harm risk associated in that an incorrect accusation is a very serious accusation to make.”

A false positive from an AI-detection tool is a scary prospect for many students, said Soumil Goyal, a senior at an International Baccalaureate high school in Houston.

“For example, my teacher might say, ‘In my previous class I had six students come up through the AI-detection test,’” he said, although he’s unsure if this is true or if his teachers might be using this as a scare tactic. “If I was ever faced with a teacher, and in his mind he is 100 percent certain that I did use AI even though I didn’t, that’s a tough scenario. [...] It can be very harmful to the student.”

Schools are adapting to growing AI use but concerns remain

In general, the survey by the Center for Democracy & Technology, a nonprofit organization that aims to shape technology policy, with an emphasis on protecting consumer rights, finds that generative AI products are becoming more a part of teachers’ and students’ daily lives, and schools are adjusting to that new reality. The survey included a nationally representative sample of 460 6th through 12th grade public school teachers in December of last year.

Most teachers—59 percent—believe their students are using generative AI products for school purposes. Meanwhile, 83 percent of teachers say they have used ChatGPT or similar products for personal or school use, representing a 32 percentage point increase since the Center for Democracy & Technology surveyed teachers last year.

The survey also found that schools are adapting to this new technology. More than 8 in 10 teachers say their schools now have policies outlining whether generative AI tools are permitted or banned and that they have received training on those policies, a drastic change from last year, when many schools were still scrambling to figure out a response to a technology that can write essays and solve complex math problems for students.

And nearly three-quarters of teachers say their schools have asked them for input on developing policies and procedures around students’ use of generative AI.

Overall, teachers gave their schools good marks when it comes to responding to the challenges created by students using generative AI—73 percent of teachers said their school and district are doing a good job.

That’s the good news, but the survey data reveals some troubling trends as well.

Far fewer teachers report receiving training on appropriate student use of AI and how teachers should respond if they think students are abusing the technology.

  • Twenty-eight percent of teachers said they have received guidance on how to respond if they think a student is using ChatGPT;
  • Thirty-seven percent said they have received guidance on what responsible student use of generative AI technologies looks like;
  • Thirty-seven percent also say they have not received guidance on how to detect whether students are using generative AI in their school assignments;
  • And 78 percent said their school sanctions the use of AI detection tools.

Only a quarter of teachers said they are “very effective” at discerning whether assignments were written by their students or by an AI tool. Half of teachers say generative AI has made them more distrustful that students’ schoolwork is actually their own.

A lack of training coupled with a lack of faith in students’ work products may explain why teachers are reporting that students are increasingly being punished for using generative AI in their assignments, even as schools are permitting more student use of AI, the report said.

Taken together, this makes the fact that so many teachers are using AI detection software—68 percent, up substantially from last year—concerning, the report said.

“Teachers are becoming reliant on AI content-detection tools, which is problematic given that research shows these tools are not consistently effective at differentiating between AI-generated and human-written text,” the report said. “This is especially concerning given the concurrent increase in student disciplinary action.”

Simply confronting students with the accusation that they used AI can lead to punishment, the report found. Forty percent of teachers said that a student got in trouble for how they reacted when a teacher or principal approached them about misusing AI.

What role should AI detectors play in schools’ fight against cheating?

Schools should critically examine the role of AI-detection software in policing students’ use of generative AI, said Lee, the professor from Stanford.

“The comfort level we have about what is an acceptable error rate is a loaded question—would we accept one percent of students being incorrectly labeled or accused? That’s still a lot of students,” he said.

A false accusation could carry wide-ranging consequences.

“It could put a label on a student that could have longer term effects on the students’ standing or disciplinary record,” he said. “It could also alienate them from school, because if it was not AI produced text, and they wrote it and were told it’s bad, that is not a very affirming message.”

Additionally, some research has found that AI detection tools are more likely to falsely identify English learners’ writing as produced by AI.

Low-income students may also be more likely to get in trouble for using AI, the CDT report said, because they are more likely to use school-issued devices. Nearly half the teachers in the survey agree that students who use school-provided devices are more likely to get in trouble for using generative AI.

The report notes that students in special education use generative AI more often than their peers, and that special education teachers are more likely to say they use AI-detection tools regularly.

Research is also finding that there are ways to trick AI-detection systems, said Lee. And schools need to think about the tradeoffs in time and resources of keeping abreast of inevitable developments in AI, in AI-detection tools, and in students’ skills at getting around those tools.

Lee said he sees why detection tools would be attractive to overwhelmed teachers. But he doesn’t think AI-detection tools alone should determine whether a student is improperly using AI to do their schoolwork. Instead, a detection result could serve as one data point among several used to determine whether a student has broken rules, which should be clearly defined.

In Poland, Maine, Shawn Vincent is the principal of Bruce Whittier Middle School, which serves about 200 students. He said that he hasn’t had too many problems with students using generative AI programs to cheat. Teachers have used AI-detection tools as a check on their gut instincts when they suspect that a student has improperly used generative AI.

“For example, we had a teacher recently who had students writing paragraphs about Supreme Court cases, and a student used AI to generate answers to the questions,” he said. “For her, it did not match what she had seen from the student in the past, so she went online to use one of the tools that are available to check for AI usage. That’s what she used as her decider.”

When the teacher approached the student, Vincent said, the student admitted to using a generative AI tool to write the answers.

Teachers are also meeting the challenge by changing their approaches to assigning schoolwork, such as requiring students to write essays by hand in class, Vincent said. And although he’s unsure about how to formulate policies to address students’ AI use, he wants to approach the issue first as a learning opportunity.

“These are middle school kids. They are learning about a lot of things this time in their life. So we try to use it as an educational opportunity,” he said. “I think we are all learning about AI together.”

Speaking from a robotics competition, Goyal, the high school student from Houston, said that he and his friends sometimes trade ideas for tricking AI-detection systems, although he said he doesn’t use ChatGPT to do the bulk of his assignments. When he uses it, it’s to generate ideas or check grammar, he said.

Goyal, who wants to work in robotics when he graduates from college, worries that some of his teachers don’t really understand how AI detection tools work and that they may be putting too much trust in the technology.

“The school systems should educate their teachers that their AI-detection tool is not a plagiarism detector [...] that can give you a direct link to what was plagiarized from,” he said. “It’s also a little bit like a hypocrisy: The teachers will say: Don’t use AI because it is very inaccurate and it will make up things. But then they use AI to detect AI.”

Innovative Tools for 21st Century Learning

The landscape of education is rapidly evolving in the 21st century. With the advent of new technologies, the tools and methods used in learning environments have transformed drastically. This article takes a closer look at some innovative tools that are reshaping how we engage with information, collaborate, and learn. Presented without commercial endorsement, these tools offer a peek into the future of learning.

The Digital Classroom

In the digital age, the traditional classroom setting is being augmented with virtual learning environments. These platforms facilitate a blend of in-person and remote learning opportunities, allowing students to interact with their peers and instructors in dynamic ways. From video conferencing to shared digital whiteboards, the digital classroom breaks down geographical barriers to education.

Immersive Learning with Virtual Reality

Virtual reality (VR) technology has opened up new frontiers in education, enabling immersive learning experiences that were once the stuff of science fiction. By simulating real-world environments or historical events, VR allows students to explore and interact with their subject matter in a hands-on manner, significantly enhancing understanding and retention.

Collaborative Tools for Group Projects

Collaboration is key in modern education. New tools have made it simpler for students and educators to work together on projects, regardless of their physical location. Shared document platforms, task management apps, and real-time editing capabilities ensure that collaborative work is seamless and productive.

Among these collaborative tools, the wiki offers a unique platform for sharing and building knowledge collaboratively. It empowers students and educators to curate content collectively, facilitating a deeper understanding of subjects through communal effort and insights.

Adaptive Assessment Tools

Adaptive assessment tools are reshaping the way educators evaluate student understanding and skills. These tools adjust the difficulty of questions based on the learner’s responses in real-time, providing a more accurate measurement of knowledge and abilities. This approach not only improves the assessment process but also helps in identifying areas where students may need additional support or resources.

Gamification of Education

Turning learning into a game is an approach that has gained momentum in the 21st century. Gamification incorporates game design elements in educational settings, making learning processes more engaging and fun. The competitive and rewarding nature of games motivates students to achieve their educational goals while enjoying the process.

AI and Personalized Learning

Artificial intelligence (AI) is revolutionizing the educational field by providing personalized learning experiences to students. AI can tailor educational content to meet the individual needs of learners, adapting in real time based on their progress. This personalized approach ensures that students receive instruction and practice where they need it most, making the learning process more efficient and effective.

Open Educational Resources (OER)

Open Educational Resources are freely accessible, openly licensed documents and media that are useful for teaching, learning, and research. OERs have surged in popularity, offering a wealth of diverse learning materials to educators and learners worldwide. This democratizes access to education and encourages a culture of sharing and collaboration among the educational community.

Blockchain in Education

The adoption of blockchain technology in education is offering novel solutions to longstanding issues like credential verification and the secure sharing of academic records. By utilizing a decentralized and immutable ledger, educational institutions can provide stakeholders with a tamper-proof record of achievements, facilitating seamless transitions between different levels of education and the workforce.

Sustainable Learning Environments

With a growing emphasis on sustainability, educational institutions are incorporating eco-friendly practices and resources into their curricula and infrastructures. This includes the use of digital textbooks to reduce paper waste, implementing green building designs, and integrating sustainable development goals into lesson plans. Such initiatives educate students on the importance of environmental stewardship while reducing the carbon footprint of educational facilities.

Final Thoughts

The innovation in educational tools is a testament to the evolving needs of 21st-century learners and educators. As technology advances, so too does the potential for these tools to transform the educational landscape. While not an exhaustive list, the examples provided highlight the breadth and depth of options available for enhancing learning processes in exciting and novel ways. The future of education is bright, with technology paving the way for more engaging, inclusive, and effective learning environments.

The real voyage of discovery consists not in seeking new landscapes, but in having new eyes —  Marcel Proust

Best Free Formative Assessment Tools for Teachers

The best free formative assessment tools can help teachers track student progress and personalize learning.

Recent updates: This article was updated in April 2024.

Formative assessments are crucial for educators to understand their students’ grasp of concepts and skills as they work their way through lessons. With this understanding, educators can better direct learners to spend more time practicing and gaining mastery of topics with which they struggle.

The following free assessment tools and apps are some of the best ones for gauging student progress at any point in the curriculum. Most make it easy to sign up for a free account with Google or other popular platforms. And although most of these tools are “freemium,” several are 100% free for educators. 

Create Formative Assessments with an AI Chatbot: With one simple prompt template, teachers can create multiple formative assessments across the curriculum. To make your formative assessment even more specific, tailor prompts to your desired specifications. Tech & Learning’s Best Free AI Quiz Generators details the pros and cons of various chatbots when creating assessments.

iCivics Assessments: The nonprofit iCivics platform is not only a free social studies lesson creation and planning tool but also a robust repository for formative assessments. Educators can simply create a free account, then click Teach > Tags > Assessments. Search for assessments filtered by grade level, type, standards, topic, and more. Each assessment is linked to lessons and extension activities.

NoRedInk: A complete literacy curriculum designed to help teachers foster strong writing skills, NoRedInk offers a free account that allows formative assessments covering a variety of topics, from clarity and style to SAT skills. Teachers can create classes and assign quizzes through the platform.

Wooclap: A fun site for creating and sharing interactive presentations and quizzes of various types, including word clouds, multiple choice, polls, open-ended questions, and more. Although the free account allows only two questions per event/quiz, users can create unlimited events for up to 1,000 participants and present real-time answers to the class.

ASSISTments Formative Math Assessments: Created by middle school math teachers in 2003, ASSISTments is a nonprofit that provides a fully free math assessment platform dedicated to the idea that high-quality formative assessments are the key to learning. Features include integration with Canvas and Google Classroom, strong professional learning resources, and an educator community forum. Assessments are tied to Common Core State Standards.

Nearpod: Highly popular with teachers, Nearpod lets users create original multimedia assessments or select from a library of 15,000+ pre-made interactive lessons. Choose from polls, multiple-choice, open-ended questions, draw-its, and gamified quizzes. The free Silver plan provides 40 students per session, 100 MB of storage, and access to formative assessments and interactive lessons.

Pear Deck: Pear Deck, an add-on for Google Slides, allows educators to quickly create formative assessments from flexible templates, turning an ordinary slideshow into an interactive quiz. Free accounts provide lesson creation, Google and Microsoft integration, templates, and more.

PlayPosit: The web- and Chrome-based PlayPosit platform provides customizable interactive video assessments, helping teachers accurately gauge their students’ mastery of video-based content. The free Classroom Basic account includes templates, free premade content, and 100 free learner attempts per month.

Flip: This simple-to-use, powerful, and fully free learning tool allows teachers to initiate class discussions by posting videos. Students then create and post their own video responses, adding enhancements such as emojis, stickers, and text.

Formative: Educators upload their own learning content, which the platform automatically transforms into assessments, or choose from the outstanding Formative library. Students respond on their own devices via text or drawing, and responses update in real time on the teacher’s screen. The free basic account for one teacher offers unlimited Formatives, real-time student responses, basic grading tools, feedback, and Google Classroom integration.

Padlet: Padlet’s seemingly simple framework, a blank digital “wall,” belies its robust capabilities in assessment, communication, and collaboration. Drag and drop almost any file type to the blank Padlet to share assessments, lessons, or presentations. Students respond with text, photos, or video. The free basic plan includes three Padlets at one time.

Socrative: This super-engaging platform allows teachers to create polls and gamified quizzes to assess student progress, with real-time results visible on screen. Socrative’s free plan permits one public room with up to 50 students, on-the-fly questions, and the Space Race assessment.

Google Forms: One of the simplest and easiest ways to create and share formative assessments. Create video quizzes, multiple-choice questions, or short-answer questions quickly. Link the Google Form to a Google Sheet in order to analyze responses. Before you share your quiz, be sure to check out 5 Ways to Prevent Cheating on Your Google Form Quiz.

Quizlet: Quizlet’s vast database of multimedia study sets includes a variety ideal for formative assessment, from flashcards to multiple-choice quizzes to the asteroid game Gravity. Basic features are free; the premium Quizlet Plus account allows for customization and tracking of student progress, with a 30-day free trial, then $35.99 annually for teachers.

Edpuzzle: Edpuzzle’s video-based learning and assessment platform helps educators turn one-way videos into interactive formative assessments. Upload videos from YouTube, TED, Vimeo, or your own computer, then add questions, links, or images to create meaningful evaluations. Free basic accounts for teachers and students allow interactive lesson creation, access to millions of videos, and storage space for 20 videos.

Diana Restifo

Diana has been Tech & Learning's web editor and contributor since 2010, dedicated to ferreting out the best free tech tools for teachers.

Education Tools: an education portal for digital productivity online. Recent articles from the portal include:

  • Almanack: Effortlessly Create Resources That Meet Every Student’s Needs. Educators, teachers, and coaches understand the challenges of catering to the diverse needs of every student in their classroom. They strive to…

  • Formative AI: Generate Engaging Educational Materials in Seconds – Automate Lesson Planning & Save Time. Imagine the never-ending stack of lesson plans looming over your desk, each one demanding hours of painstaking research, writing, and revision. You long for captivating…

  • Teachify: Generate Personalized Assignments, Adapt to Student Progress with AI. Teachers are continually seeking innovative solutions to streamline their workflow, enhance student engagement, and personalize learning experiences. With advancements in artificial intelligence (AI), educators now…

  • CourseMind: Automate Your Teaching & Reclaim Your Time! (Educator’s Dream!) Are you an educator overwhelmed by lesson planning, grading, and the endless list of tasks that steal your time? The passion for teaching can easily…

  • Storytelling Made Fun: Animate Anything with Toontastic 3D in Minutes! For educators, finding innovative tools to engage students in learning can be a perpetual quest. Storytelling stands as one of the oldest and most powerful…

  • Penelope.AI: Your AI Co-Pilot for Effortless Manuscript Submission. Considering the continuous developments within academic publishing, the journey from manuscript completion to journal acceptance can often feel like navigating a maze. From adhering to…

  • The Future of Research is Here: Chat with Any PDF Using Powerful AI from ChatPDF. Information is abundant but time is limited, and staying updated with the latest research can be a daunting task. Whether you’re a teacher, educator, student, professional,…

  • Video2Recipe: Turn Cooking Videos into Simple Recipes. Are you tired of pausing and rewinding cooking videos, trying to catch every ingredient and instruction? Do you wish there was an easier way to…

  • MapDeduce: Extract Answers & Insights Instantly – Conquer Documents, Contracts & Research. The ability to quickly extract answers and insights from vast amounts of data is crucial for businesses, researchers, and individuals alike. However, with the sheer…

  • Listening: Turn Any Text into Audio – Master Academic Papers on the Go. Keeping up with academic research can feel like a challenging task, especially for educators, researchers, lecturers, students, and coaches, who are constantly on the move.…

  • Write Like a Pro: ProWritingAid Makes You a More Polished, Confident Writer. The ability to communicate effectively through writing is paramount, especially in the creator economy. Whether you’re a teacher crafting lesson plans, an educator providing feedback…

  • TeacherMatic: AI-powered Tools for Educators by Educators – Save Time, Elevate Learning. Teachers are faced with myriad challenges that can often leave them overwhelmed and overworked. From lesson planning to grading, administrative tasks to student engagement, the…

  • Teacher’s Best Friend: MyLessonPal Simplifies Your Day, Enhances Instruction, & Boosts Student Success. Teachers and educators often find themselves juggling numerous responsibilities, from lesson planning to delivering engaging instruction and assessing student progress. With the advent of technology,…

  • Grammarly: Level Up Your Writing & Improve Your Clarity Wherever You Write. In the world of writing, clarity and correctness are paramount. Whether you’re a professional writer, a student, an educator, or someone who simply wants to…

  • Foresight at Your Fingertips: Predict Deadlines, Analyze Data, & Master Projects with Savvy Planner. Staying organized and on top of deadlines is essential for success. Whether you’re a teacher, educator, student, coach, or consultant, managing multiple projects and tasks…

  • Turn Your Browser into a Powerhouse: Simplify Tasks, Streamline Workflow with TinaMind’s AI Assistant. Whether you’re a teacher managing lesson plans, an educator researching new teaching methodologies, a student juggling assignments, a coach organizing training sessions, a consultant optimizing…

  • Streamline Leave Management in Slack with SpockOffice: A Solution for Educational Institutions and Beyond. Managing leave efficiently is crucial for any organization’s smooth operation. This holds especially true for educational institutions, where teachers, educators, students, coaches, and consultants form…

  • More Than Just a To-Do List: GetZing Empowers You to Build Lasting Habits & Live Your Best Life. Juggling various responsibilities can be overwhelming, especially for educators, students, coaches, and consultants. Balancing work, personal life, and professional development often requires more than just…

Educational Tools: Educational portal for digital productivity

Educational Tools is an educational portal offering digital resources to enhance teaching and learning, founded by education and technology enthusiasts. It aims to increase the educational productivity of students, teachers and professionals with online tools and informative articles, while making learning effective through tutorials and tips. The site promotes inspiring and inclusive teaching, providing educational resources and ideas for educators.

Who is Educational Tools for?

  • Parents: Looking for resources for homeschooling.
  • Tutors: In search of tools to enhance their teaching sessions.
  • Instructional Designers: Wanting to integrate educational technologies into their courses.
  • School Administrators: Looking to improve the educational infrastructure of their institution.
  • Educational Software Developers: Looking for inspiration for new creations.
  • Education Researchers: Interested in the latest trends and tools.
  • Non-governmental Organizations: NGOs involved in education for all.
  • Librarians: Wishing to offer educational resources to visitors.
  • Entrepreneurs: Looking for innovative ideas and tools.
  • Career Changers: Eager to acquire new skills for their career development.
  • Students: In search of practical advice, educational resources, and inspiring ideas to excel in their mission.
  • Teachers

Reviews

« The platform is an excellent find! » (Lyvia, Student)

« A must for discovering innovative teaching ideas and tools. » (Dimitri, Teacher)

« It makes it much easier to choose the right tools. » (Daniel, Freelance)

America has legislated itself into competing red, blue versions of education

American states passed a blizzard of education laws and policies over the past six years that aim to reshape how K-12 schools and colleges teach and present issues of race, sex and gender to the majority of the nation’s students — with instruction differing sharply by states’ political leanings, according to a Washington Post analysis.

Three-fourths of the nation’s school-aged students are now educated under state-level measures that either require more teaching on issues like race, racism, history, sex and gender, or which sharply limit or fully forbid such lessons, according to a sweeping Post review of thousands of state laws, gubernatorial directives and state school board policies. The restrictive laws alone affect almost half of all Americans aged 5 to 19.

Since 2017, 38 states have adopted 114 such laws, rules or orders, The Post found. The majority of policies are restrictive in nature: 66 percent circumscribe or ban lessons and discussions on some of society’s most sensitive topics, while 34 percent require or expand them. In one example, a 2023 Kentucky law forbids lessons on human sexuality before fifth grade and outlaws all instruction “exploring gender identity.” On the other hand, a 2021 Rhode Island law requires that all students learn “African Heritage and History” before high school graduation.

The Post included in its analysis only measures that could directly affect what students learn. Thus, 100 of the laws in The Post’s database apply only to K-12 campuses, where states have much greater power to shape curriculums. At public institutions of higher education — where courts have held that the First Amendment protects professors’ right to teach what they want — the laws instead target programs like student or faculty trainings or welcome sessions.

The divide is sharply partisan. The vast majority of restrictive laws and policies, close to 90 percent, were enacted in states that voted for Donald Trump in the 2020 presidential election, The Post found. Meanwhile, almost 80 percent of expansive laws and policies were enacted in states that voted for Joe Biden in 2020.

The explosion of laws regulating school curriculums is unprecedented in U.S. history for its volume and scope, said Jonathan Zimmerman, a University of Pennsylvania professor who studies education history and policy. Controversy and debate over classroom lessons is nothing new, Zimmerman said, but states have never before stepped in so aggressively to set rules for local schools. School districts have traditionally had wide latitude to shape their lessons.

He said it remains an open question whether all laws will translate to curriculum changes, predicting some schools and teachers may refuse to alter their pedagogy. Still, a nationally representative study from the Rand Corp. released this year found that 65 percent of K-12 teachers report they are limiting instruction on “political and social issues.”

“What the laws show is that we have extremely significant differences over how we imagine America,” Zimmerman said. “State legislatures have now used the power of law to try to inscribe one view, and to prevent another. And so we’re deeply divided in America.”

In practice, these divisions mean that what a child learns about, say, the role slavery played in the nation’s founding — or the possibility of a person identifying as nonbinary — may come to depend on whether they live in a red or blue state.

Legislators advancing restrictive education laws argue they are offering a corrective to what they call a recent left-wing takeover of education. They contend that, in the past decade or so, teachers and professors alike began forcing students to adopt liberal viewpoints on topics ranging from police brutality to whether gender is a binary or a spectrum.

Tennessee state Rep. John Ragan (R), who sponsored or co-sponsored several laws in his state that limit or ban instruction and trainings dealing with race, bias, sexual orientation and gender identity on both K-12 and college campuses, said the legislation he helped pass does not restrict education.

“It is restricting indoctrination,” Ragan said. Under his state’s laws, he said, “the information presented is factually accurate and is in fact something worth knowing.”

Those advancing expansive legislation, by contrast, argue they are fostering conditions in which students from all backgrounds will see themselves reflected in lessons. This will make it easier for every student to learn and be successful, while teaching peers to be tolerant of one another’s differences, said Washington state Sen. Marko Liias (D).

Liias was the architect of a law his state passed last month that requires schools to adopt “inclusive curricula” featuring the histories, contributions and perspectives of the “historically marginalized,” including “people from various racial, ethnic, and religious backgrounds, people with differing learning needs, people with disabilities [and] LGBTQ people.” He was inspired to propose the bill after hearing from educators who wanted to create more welcoming classrooms and by memories of his own experiences as a queer student in the 1980s and 1990s, when, he said, there were no LGBTQ role models taught or accepted in schools.

“When schools are inclusive broadly of all the identities brought to the classroom, then everybody thrives and does better,” Liias said.

To construct its database of education laws, The Post analyzed more than 2,200 bills, policies, gubernatorial directives and state school board rules introduced since 2017. The Post identified regulations for review by examining state legislative databases, education law trackers maintained by national bipartisan nonprofits and the websites of various advocacy groups that monitor curriculum legislation.

How curriculum policies took hold

Some blue states began enacting expansive education laws in the late 2010s. From 2017 to 2020, 10 states passed legislation or rules that required schools to start teaching about the history of underrepresented groups such as Black Americans, Pacific Islanders or LGBTQ Americans, The Post found.

State and school leaders were drawing on more than a dozen studies published from the 1990s to 2017 that found student performance, attendance, and graduation rates rise when children see people like them included in the curriculum, said Jennifer Berkshire, a Yale lecturer on education studies.

“They were thinking, ‘You know, our curriculums aren’t representative enough,’” Berkshire said. “The argument was, if we’re going to realize the goal of full rights and civil participation for kids, we need to do things differently.”

Fourteen of these laws, or 36 percent, came in a rush in 2021, the year after the police killing of George Floyd sparked massive demonstrations and a national reckoning over racism. At the time, activists, teachers, parents, and high school students across America were urging schools to teach more Black history and feature more Black authors.

Of the expansive laws and policies The Post analyzed, the majority — 69 percent — require or expand education on race or racial issues, especially on Black history and ethnic studies. About a quarter add or enhance education on both LGBTQ and racial issues. Just 8 percent focus solely on LGBTQ lives and topics.

But the onslaught of restrictive legislation in red states began in 2021, too, also inspired in many cases by parent concerns over curriculums.

Anxiety first stirred during coronavirus pandemic-era school shutdowns, as some mothers and fathers, granted an unprecedented glimpse into lessons in the era of school-by-laptop, found they did not like or trust what their children were learning.

Soon, some parents were complaining that lessons were biased toward left-leaning views and too focused on what they saw as irrelevant discussions of race, gender and sexuality — laments taken up by conservative pundits and politicians. National groups like Moms for Liberty formed to call out and combat left-leaning teaching in public schools.

Their fears became legislation with speed: Mostly red states passed 26 restrictive education laws and policies in 2021, 19 more the next year, and another 25 the year after that.

“If you’ve got parents upset at what they’re seeing, they’re going to go to school board meetings and take it up with their legislators,” said Robert Pondiscio, a senior fellow studying education at the conservative American Enterprise Institute. “And legislators will do what they do: pass laws.”

How the restrictions and expansions work

The plurality of restrictive laws, 47 percent, target education on both race and sex. About a third solely affect education on gender identity and sexuality, while 21 percent solely affect education on race.

Almost 40 percent of these laws work by granting parents greater control of the curriculum — stipulating that they must be able to review, object to or remove lesson material, as well as opt out of instruction. Schools have long permitted parents to weigh in on education, often informally; but under many of the new laws, parental input has more weight and is mandatory.

Another almost 40 percent of the laws forbid schools from teaching a long list of often-vague concepts related to race, sex or gender.

These outlawed concepts usually include the notion that certain merits, values, beliefs, status or privileges are tied to race or sex; or the theory that students should feel ashamed or guilty due to their race, sex or racial past. One such law, passed in Georgia in 2022, forbids teaching that “an individual, solely by virtue of his or her race, bears individual responsibility for actions committed in the past by other individuals of the same race.”

At the college level, among the measures passed in recent years is a 2021 Oklahoma law that prohibits institutions of higher education from holding “mandatory gender or sexual diversity training or counseling,” as well as any “orientation or requirement that presents any form of race or sex stereotyping.”

By contrast, a 2023 California measure says state community college faculty must employ “teaching, learning and professional practices” that reflect “anti-racist principles.”

Some experts predicted the politically divergent instruction will lead to a more divided society.

“When children are being taught very different stories of what America is, that will lead to adults who have a harder time talking to each other,” said Rachel Rosenberg, a Hartwick College assistant professor of education.

But Pondiscio said there is always tension in American society between the public interest in education and parents’ interest in determining the values transmitted to their children. The conflict veers from acute to chronic, he said, and currently it’s in an acute phase. “But I don’t find it inappropriate. I think it is a natural part of democratic governance and oversight,” Pondiscio said.

He added, “One man’s ‘chilling effect’ is another man’s appropriate circumspections.”

7 AI Tools That Help Teachers Work More Efficiently

These apps and websites can help teachers boost their productivity, personalize learning, and create lesson content.

Over the past five years, I’ve explored ways to integrate AI into my teaching practice—even before ChatGPT and other generative AI tools became some of the most talked-about topics in education. Every educator needs to learn about AI and how we can leverage this technology to benefit our students and enhance our own work. To best provide for our students, we need to understand how this technology will impact them and us. And what better way to understand it than to explore new AI tools in our own teaching practice?

Why Educators Need to Understand AI Tools

Our roles as educators have continued to change over the years. With new technology comes a bit of hesitancy, especially with something as powerful as AI. In our schools, we have to provide opportunities for students to learn about changing technology because of the impact it may have on their future. Not only can AI tools enhance creativity and productivity, but they can also provide educators with valuable insights into student learning and assist with some of educators’ most time-consuming tasks.

Even with all of the promise of AI, it is important that we take time to talk about artificial intelligence in our classrooms. We not only teach the content but also serve as mentors, facilitators of learning, and co-learners with our students, especially as we embrace these emerging, powerful technologies. It’s important that we help our students learn about the benefits of these tools and also show them how to use the tools properly, responsibly, and ethically.

How AI Can Improve a Teacher’s Job

Personalized learning: Educators can provide tailored learning experiences based on AI-driven analytics that provide valuable insights into student performance and learning trends. Using this data, AI can instantly adapt student learning materials. Teachers can then use this information to provide personalized learning experiences, adapting to each student’s strengths, weaknesses, and learning pace. 

Productivity and efficiency: Greater efficiency comes with AI as well. Educators are responsible for a variety of clerical tasks, such as communicating with students and their families, grading assessments, and providing feedback. Educators may find they spend more time on these clerical tasks rather than on teaching and working directly with students. The right AI tools can help to automate or streamline these tasks, which allows teachers to have additional time with their students.

Creating and supplementing content: Through AI-powered platforms, teachers can curate a range of educational resources. With generative AI in particular, teachers are able to create lessons, activities, assessments, prompts for discussion, and presentations simply by providing a short prompt with keywords.

7 AI Education Tools That Work

Here are seven AI-powered tools that help teachers personalize learning while becoming more efficient, saving time that can then be spent with students. I have used each of these for my own writing and for creating presentations, and the time they save by generating the slides alone helps me focus more closely on the content. I also appreciate that the tools offer translation options and a variety of templates and other resources that are commonly used by educators.

1. AudioPen: For years, I have been using voice-to-text to write blogs, books, emails, and lesson plans. This AI-powered web app, which you can use on your computer or phone, takes your words and enhances them as it generates the text, which you can edit as needed.

2. Canva Magic Write: Canva now offers an AI writing assistant called Magic Write, which can inspire creativity in writing. By analyzing a short word prompt, it helps with brainstorming, creating an outline, writing lesson plans, or generating a visually engaging presentation in far less time, making it a useful tool for educators creating presentations or other graphics for classroom use.

3. Curipod: This website enables teachers to create interactive lessons in minutes using AI. Students can explore various topics, and the AI functionality helps generate customized lessons tailored to their learning needs. Teachers simply type in a topic, and a ready-to-run lesson is generated with text, images, and activities such as polls, open-ended responses, word clouds, and more. There are even built-in activities that focus on SEL check-ins.

4. Eduaide.Ai: This AI-assisted lesson-development tool provides educators with more than 100 resource types for creating high-quality instructional materials, and it can translate the generated content into more than 15 languages instantly. Educators can generate a syllabus, create discussion prompts, use the “teaching assistant” for help with creating individualized education program plans, write emails, or even compile a list of accommodations for students. Eduaide.Ai includes a content generator, teaching assistant, feedback bot, free-form chat, and assessment builder.

5. OpenAI: The recently released Teaching with AI guide for teachers was created to help educators use ChatGPT in their classrooms. The guide comes with several suggested prompts and includes explanations that clarify exactly how ChatGPT works and what its limitations are, and it provides reminders of the importance of verifying information and checking for bias. With GPT-4, available in the paid version of ChatGPT, there is greater accuracy and reliability of information than with the original version.

6. Quizizz: With Quizizz, teachers can design quizzes that create a personalized learning path based on each student’s responses. Teachers can also create lessons with Quizizz, which now has an AI enhancement that can adjust question difficulty, check grammar, and redesign questions to reflect real-world scenarios, with more features on the way.

7. Slidesgo: This tool provides access to free templates via Google Slides and now includes the AI Presentation Maker. With this new functionality, presentations can be created within minutes: simply choose a topic; select a tone such as casual, creative, or professional; make changes; and download your presentation. A time-saver for sure!

Creating and sharing these resources with our students leads to rich conversations about the benefits of AI and proper use of this technology for creating and learning. There are many more tools available for teachers to explore that can help with each of the key areas that I mentioned. Most important is selecting a tool to start with and reflecting on how it impacted your practice.

Corporate Learning Is Boring — But It Doesn’t Have to Be

  • Duncan Wardle

Four lessons from Disney on infusing creativity into employee training.

Most corporate learnings aren’t cutting it. Almost 60% of employees say they’re interested in upskilling and training, but 57% of workers also say they’re already pursuing training outside of work. The author, the former Head of Innovation and Creativity at Disney, argues that creativity is the missing piece to make upskilling engaging and effective. From his experience, he shares four strategies to unlock creativity in trainings: 1) Encourage “What if?”, 2) respond “How else?” to challenges, 3) give people time to think by encouraging playfulness, and 4) make training a game.

With almost 60% of employees saying they are very or extremely interested in participating in upskilling programs per a joint Gallup-Amazon survey, chances are your employees are interested too. So why are 57% of workers taking their education into their own hands? Most corporate learning programs aren’t cutting it because they lack the necessary element of creativity.

  • Duncan Wardle, formerly vice president of innovation and creativity at The Walt Disney Company, launched his creative consulting company iD8 & innov8 to help companies embed a culture of innovation and creativity across their entire organization. Duncan spent his 25-year career at Disney developing some of its most innovative ideas and strategies — ideas that would forever change the way the company expands its impact, trains its employees, and solves problems creatively. He has a new book releasing in fall 2024 titled “The Imagination Emporium,” a tool kit that makes innovation accessible, creativity tangible, and the process fun.

The 15 Top AI-Powered Tools for Automated Unit Testing

Artificial intelligence can be a huge help to humans writing unit testing scripts.

Software development is a creative endeavor, but it can be filled with tedious tasks. Most mundane of all is writing “unit tests,” bits of code to verify that software components work as intended. Unit tests help developers catch bugs early and ensure that code can be maintained.

Ideally, developers writing code for a program write unit tests as they go along. But writing unit tests is the drudge work of software development and it can take up a significant amount of a developer’s time. Worse, developers can make mistakes in manually written tests for complex codebases. Consequently, a lot of software lacks adequate unit tests and that makes the code difficult to maintain. Without unit tests, if something breaks, finding the problem can be like hunting for a needle in a haystack.

A unit is a part of a program that performs a particular operation. Units are the building blocks of software; a piece of software is a stack of units. If a unit doesn’t do what it’s supposed to do, the software program will not work efficiently — or, in some cases, will not work at all.

Unit testing involves testing individual units of a software application in isolation to ensure they function as expected. The process involves identifying the unit to be tested, writing a test case that exercises the unit and verifies its behavior, running the test and observing the results. If the test fails, the developer investigates the issue, makes necessary changes to the unit's code, and re-runs the test until it passes. Unit testing helps catch bugs early, improves code quality and enables faster debugging, ultimately ensuring the reliability and quality of the software application.
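To make that cycle concrete, here is a minimal sketch of a unit and a test that exercises it, written in Java with JUnit 5. The class, method, and values are illustrative assumptions for this article, not the output of any tool discussed below.

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

// The "unit": a small piece of code that performs one operation.
class PriceCalculator {
    // Applies a percentage discount to a price; rejects invalid input.
    static double applyDiscount(double price, double percent) {
        if (percent < 0 || percent > 100) {
            throw new IllegalArgumentException("percent must be in [0, 100]");
        }
        return price * (1 - percent / 100.0);
    }
}

// The test case: exercises the unit in isolation and verifies its behavior.
class PriceCalculatorTest {
    @Test
    void appliesDiscountToPrice() {
        // A 25% discount on 200.0 should yield 150.0.
        assertEquals(150.0, PriceCalculator.applyDiscount(200.0, 25.0), 1e-9);
    }

    @Test
    void rejectsOutOfRangeDiscount() {
        // Invalid input should fail fast rather than silently produce bad data.
        assertThrows(IllegalArgumentException.class,
                () -> PriceCalculator.applyDiscount(200.0, 150.0));
    }
}

If either test fails, the developer investigates, fixes applyDiscount, and re-runs the tests until they pass.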

Automated unit testing will play an increasingly critical role in maintaining the integrity and robustness of the code that runs much of our lives. Artificial intelligence is now helping developers write those unit tests, freeing them to focus on higher-value tasks.

One of the most promising tools is Diffblue Cover , an AI-powered platform that automatically writes unit tests for Java code, one of the most popular programming languages. Diffblue uses reinforcement learning to analyze the codebase and generate human-readable, executable tests that cover a wide range of scenarios. "We only focus on unit tests,” said Peter Schrammel, Diffblue’s co-founder, explaining his company’s success in solving the problem for Java.

Another notable tool is EvoSuite , an open-source framework that uses genetic algorithms to generate test suites for Java programs, although EvoSuite's generated tests are not as readable as those produced by Diffblue.

Both Diffblue and EvoSuite stand out for being completely automatic.

Beyond that, there are many code suggestion tools that can help developers write tests. While these tools speed up the work of writing unit tests, they are not fully automatic and still require a developer’s time and attention.

For example, Amazon CodeWhisperer , GitHub Copilot , and even ChatGPT can look at a function and predict a unit test. But generative AI based on large language models (LLMs) is prone to errors, and so developers still need to check their work. They help developers, but don’t free them from the unit-test writing task.
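As one hedged illustration of why generated tests still need review (hypothetical code, not the output of any particular tool): asked to test the leap-year rule below, a chatbot could plausibly suggest assertTrue(LeapYear.isLeapYear(1900)), a test that compiles and runs but encodes a wrong expectation, since century years are leap years only when divisible by 400. The developer has to catch and correct that kind of mistake.

import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;

class LeapYear {
    // The unit under test: the Gregorian leap-year rule.
    static boolean isLeapYear(int year) {
        return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
    }
}

class LeapYearTest {
    @Test
    void handlesCenturyYearsCorrectly() {
        // A generated test might assert that 1900 is a leap year; it is not.
        assertFalse(LeapYear.isLeapYear(1900));
        assertTrue(LeapYear.isLeapYear(2000));
        assertTrue(LeapYear.isLeapYear(2024));
    }
}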

The space is starting to heat up with new start-ups entering the race. Startup Cognition has announced an AI agent called Devin that it claims can scan code, identify and fix bugs and write unit tests. A Cognition engineer named Andrew posted a video on YouTube of Devin reviewing a code repository and successfully writing a regression test on its own. Devin is not yet publicly available so it will take some time to see if it can challenge the current leaders.

Here are the top tools on the market today for writing unit tests. These tools use various AI techniques to automate and optimize different aspects of code review, test generation and quality assurance.

Diffblue Cover

Diffblue Cover provides AI-powered unit test generation for Java codebases.

  • Automated Java unit test generation tool
  • Uses reinforcement learning to generate and optimize tests
  • Integrates with popular Java integrated development environments (IDEs) and build tools
  • Achieves high code coverage and maintains tests over time
  • Offers both cloud and on-premises deployment options

GitHub Copilot

GitHub Copilot is powered by generative AI models developed by GitHub, OpenAI and Microsoft, and is trained on all natural languages that appear in public repositories.

  • AI pair programmer that suggests code and entire functions in real-time
  • Supported in terminals through GitHub CLI (command line interface) and natively integrated into GitHub.com with the GitHub Copilot Enterprise plan
  • Suggests code completions in the code editor
  • Answers questions in a chat
  • Automatically pulls relevant context from the opened project

Tabnine is an AI coding assistant supporting multiple languages and IDEs.

  • AI coding assistant supporting code generation, explanation, and fixes across 80+ languages and frameworks, plus support for automatic generation of tests and documentation
  • Support for a broad set of IDEs, including all of the most popular (e.g., Visual Studio, VS Code, IntelliJ, Eclipse, Android Studio)
  • Pulls context automatically from all relevant files accessible from the IDE, and can be connected to any Git-based repo for increased context
  • Offers a proprietary model trained exclusively on permissively licensed code, and also offers custom models trained on a customer’s own code (trained and deployed privately)
  • Offers deployment in secure SaaS, or private deployments on VPC or on-premises (can be fully air-gapped)

CodiumAI Codiumate

CodiumAI Codiumate is an AI coding assistant for writing, reviewing and testing code.

  • IDE plugin for interactive high-quality code generation, testing and reviewing
  • Interactively generates a task plan and spec
  • Suggests task-aware code completions in the code editor
  • Provides guidance, code improvements, task review, etc. to generate high-quality code to complete the task
  • Private instances can be installed on-premises
  • Uses a proprietary model (but enterprises can choose to use OpenAI models instead)

Google Cloud's Duet

Google Cloud's Duet provides AI-powered code completion and generation for developers.

  • Chat interface for coding questions and guidance on cloud best practices
  • Code explanation to quickly understand, map and navigate unfamiliar code bases
  • Code security guardrails to scan AI-generated code for vulnerabilities
  • Leverages Google AI foundation models
  • Source citations to help comply with license requirements

Amazon Q/Amazon CodeWhisperer

Amazon Q/Amazon CodeWhisperer is an AI-powered coding companion from Amazon Web Services.

  • Accessible directly in popular IDEs
  • Proposes code snippets to full functions, across 15 programming languages
  • Provides company-specific responses through customization capability
  • Scans for security vulnerabilities and suggests remediation in code
  • Filters out code suggestions that may be considered biased or unfair
  • Flags code suggestions that may resemble particular open-source training data
  • Upgrades programming language versions
  • Builds new application features with a descriptive prompt
  • Uses a proprietary model

Symflower provides automated unit test generation for Java.

  • Combines symbolic execution, static analysis, and natural language processing
  • Generates readable, maintainable, and effective unit tests
  • Explains test assertions and edge cases in natural language
  • Integrates with Java IDEs and continuous integration/continuous delivery (CI/CD) pipelines

Testim is an AI-based test automation platform for web and mobile apps.

  • AI-powered test automation platform
  • Supports web, mobile and API testing
  • Uses machine learning to create and maintain tests
  • Provides visual test editing and debugging tools
  • Integrates with popular CI/CD tools and test management systems

Squaretest is a plugin for IntelliJ IDEA that automatically generates unit tests for Java classes.

  • Uses dataflow analysis, control flow analysis, pattern detection and heuristics to generate as much of the tests as it can. Manual work is required to complete the generated tests.
  • Enables developers to customize output by creating custom Apache Velocity templates.
  • Enables developers to choose which dependencies should be mocked, which methods should be tested and how to construct the source class.

Bito is an AI-powered code review and quality assurance tool.

  • Analyzes code changes and understands your codebase, providing real-time feedback
  • Identifies potential bugs, security issues and performance bottlenecks
  • Supports multiple programming languages and frameworks
  • Integrates with popular version control systems and CI/CD tools

DeepUnitAI is an AI tool that writes unit tests for multiple programming languages.

  • AI-driven unit test generation tool
  • Supports various languages including TypeScript, JavaScript, Java, Python, and C#
  • Uses deep learning to understand code semantics and generate meaningful tests
  • Provides IDE extensions, CI/CD pipeline integration, and a CLI option

Seniordev.ai

Seniordev.ai is an AI programming assistant for code generation, optimization and mentoring.

  • Web-based application designed to enable dev teams to work more efficiently and effectively
  • Uses AI to review pull requests, create/update docs and generate unit tests where applicable
  • Provides a collaborative interface for team members to work together
  • Integrates with popular version control systems and project management tools

Testsigma.com

Testsigma.com is an AI-driven, codeless test automation platform for web and mobile.

  • AI-driven test automation platform for web, mobile and API testing
  • Supports codeless test creation using natural language processing
  • Provides a visual interface for creating and managing tests
  • Offers real-time test results and analytics

Functionize

Functionize is an intelligent test automation platform that uses machine learning.

  • AI-powered test automation platform for web and mobile applications
  • Uses natural language processing and machine learning to create and maintain tests
  • Supports cross-browser and cross-device testing

Mabl

Mabl is an AI-powered, codeless test automation platform for web applications.

  • Built using cloud, AI and low-code innovations
  • Scales functional and non-functional testing across web apps, mobile apps, APIs, performance and accessibility
  • Vendor-reported results: 3x faster test creation, 70% less test maintenance, 10x faster test runs and 80% savings over homegrown solutions
  • Integrates with Slack, Jira, Microsoft Teams and GitHub

Bottom Line

As the field of AI-assisted development continues to evolve, we can expect to see more sophisticated automated unit testing tools that leverage advanced machine learning techniques to generate even more comprehensive and reliable test suites. These tools will likely integrate seamlessly with development workflows, making it easier for developers to incorporate automated testing into their daily routines.

Craig S. Smith
