
Review Article: Innovative Pedagogies of the Future: An Evidence-Based Selection


  • Institute of Educational Technology, The Open University, Milton Keynes, United Kingdom

There is a widespread notion that educational systems should empower learners with skills and competences to cope with a constantly changing landscape. Reference is often made to skills such as critical thinking, problem solving, collaborative skills, innovation, digital literacy, and adaptability. What is debatable is how best to achieve the development of those skills, in particular which teaching and learning approaches are suitable for facilitating or enabling complex skills development. In this paper, we build on our previous work exploring new forms of pedagogy for an interactive world, as documented in our Innovating Pedagogy report series. We present a set of innovative pedagogical approaches that have the potential to guide teaching and transform learning. An integrated framework has been developed to select pedagogies for inclusion in this paper, consisting of the following five dimensions: (a) relevance to effective educational theories, (b) research evidence about the effectiveness of the proposed pedagogies, (c) relation to the development of twenty-first century skills, (d) innovative aspects of pedagogy, and (e) level of adoption in educational practice. The selected pedagogies, namely formative analytics, teachback, place-based learning, learning with drones, learning with robots, and citizen inquiry, are either tied to specific technological developments or have emerged from an advanced understanding of the science of learning. Each one is presented in terms of the five dimensions of the framework.

Introduction

In its vision for the future of education in 2030, the Organization for Economic Co-operation and Development (OECD, 2018) views essential learner qualities as the acquisition of skills to embrace complex challenges and the development of the person as a whole, valuing common prosperity, sustainability, and wellbeing. Wellbeing is perceived as "inclusive growth" related to equitable access to "quality of life, including health, civic engagement, social connections, education, security, life satisfaction and the environment" (p. 4). To achieve this vision, a varied set of skills and competences is needed that would allow learners to act as "change agents" who can achieve positive impact on their surroundings by developing empathy and anticipating the consequences of their actions.

Several frameworks have been produced over the years detailing specific skills and competences for the citizens of the future (e.g., Trilling and Fadel, 2009 ; OECD, 2015 , 2018 ; Council of the European Union, 2018 ). These frameworks refer to skills such as critical thinking, problem solving, team work, communication and negotiation skills; and competences related to literacy, multilingualism, STEM, digital, personal, social, and “learning to learn” competences, citizenship, entrepreneurship, and cultural awareness ( Trilling and Fadel, 2009 ; Council of the European Union, 2018 ). In a similar line of thinking, in the OECD Learning Framework 2030 ( OECD, 2018 ) cognitive, health and socio-emotional foundations are stressed, including literacy, numeracy, digital literacy and data numeracy, physical and mental health, morals, and ethics.

The question we are asked to answer is whether the education vision of the future, or the development of the skills needed to cope with an ever-changing society, has been met, or can be met. The short answer is: not yet. For example, the Programme for International Student Assessment (PISA) has been ranking educational systems based on 15-year-old students' performance on tests of reading, mathematics, and science every 3 years in more than 90 countries. In the latest published report (OECD, 2015), Japan, Estonia, Finland, and Canada are the four highest performing OECD countries in science. This means that students from these countries on average can "creatively and autonomously apply their knowledge and skills to a wide variety of situations, including unfamiliar ones" (OECD, 2016a, p.2). Yet about 20% of students across participating countries perform below the baseline level of proficiency in science and reading (OECD, 2016b). Those most at risk are socio-economically disadvantaged students, who are almost three times more likely than their peers not to meet the given baselines. These outcomes are quite alarming; they stress the need for evidence-based, effective, and innovative teaching and learning approaches that can result in not only improved learning outcomes but also greater student wellbeing. Overall, an increasing focus on memorization and testing has been observed in education, including the early years, that leaves no space for active exploration and playful learning (Mitchell, 2018), and threatens the wellbeing and socioemotional growth of learners. There is a growing evidence base showing that although teachers would like to implement more active, innovative forms of education to meet the diverse learning needs of their students, due to a myriad of constraints they often resort to more traditional, conservative approaches to teaching and learning (Ebert-May et al., 2011; Herodotou et al., 2019).

In this paper, we propose that the distance between educational vision and current teaching practice can be bridged through the adoption and use of appropriate pedagogy that has been tested and proven to contribute to the development of the person as a whole. Evidence of impact becomes a central component of the teaching practice; what works and for whom in terms of learning and development can provide guidelines to teaching practitioners as to how to modify or update their teaching in order to achieve desirable learning outcomes. Educational institutions may have already adopted innovations in educational technology equipment (such as mobile devices), yet this change has not necessarily been accompanied by respective changes in the practice of teaching and learning. Enduring transformations can be brought about through “pedagogy,” that is improvements in “the theory and practice of teaching, learning, and assessment” and not the mere introduction of technology in classrooms ( Sharples, 2019 ). PISA analysis of the impact of Information Communication Technology (ICT) on reading, mathematics, and science in countries heavily invested in educational technology showed mixed effects and “no appreciable improvements” ( OECD, 2015 , p.3).

The aim of this study is to review and present a set of innovative, evidence-based pedagogical approaches that have the potential to guide teaching practitioners and transform learning processes and outcomes. The selected pedagogies draw from the successful Innovating Pedagogy report series ( https://iet.open.ac.uk/innovating-pedagogy ), produced by The Open University UK (OU) in collaboration with other centers of research in teaching and learning, which explores innovative forms of teaching, learning, and assessment. Since 2012, the OU has produced seven Innovating Pedagogy reports with SRI International (USA), the National Institute of Education (Singapore), Learning In a NetworKed Society (Israel), and the Center for the Science of Learning & Technology (Norway). For each report, teams of researchers shared ideas, proposed innovations, read research papers and blogs, and commented on each other's draft contributions in an organic manner (Sharples et al., 2012, 2013, 2014, 2015, 2016; Ferguson et al., 2017, 2019). Starting from an initial list of potentially promising educational innovations that may already be in currency but have not yet reached a critical mass of influence on education, these lists were critically and collaboratively examined, and reduced to 9–11 main topics identified as having the potential to provoke major shifts in educational practice.

After seven years of gathering a total of 70 innovative pedagogies, in this paper seven academics from the OU, authors of the various Innovating Pedagogy reports, critically reflected on which of these approaches have the strongest evidence and/or potential to transform learning processes and outcomes to meet the future educational skills and competences described by OECD and others. Based upon five criteria and extensive discussions, we selected six approaches that we believe have the most evidence and/or potential for future education:

• Formative analytics,

• Teachback,

• Place-based learning,

• Learning with robots,

• Learning with drones,

• Citizen inquiry.

Formative analytics is defined as “supporting the learner to reflect on what is learned, what can be improved, which goals can be achieved, and how to move forward” ( Sharples et al., 2016 , p.32). Teachback is a means for two or more people to demonstrate that they are progressing toward a shared understanding of a complex topic. Place-based learning derives learning opportunities from local community settings, which help students connect abstract concepts from the classroom and textbooks with practical challenges encountered in their own localities. Learning with robots could help teachers to free up time on simple, repetitive tasks, and provide scaffolding to learners. Learning with drones is being used to support fieldwork by enhancing students' capability to explore outdoor physical environments. Finally, citizen inquiry describes ways that members of the public can learn by initiating or joining shared inquiry-led scientific investigations.

Devising a Framework for Selection: The Role of Evidence

Building on previous work (Puttick and Ludlow, 2012; Ferguson and Clow, 2017; Herodotou et al., 2017a; John and McNeal, 2017; Batty et al., 2019), we propose an integrated framework for selecting pedagogies. The framework resulted from ongoing discussions amongst the seven authors of this paper as to how educational practitioners should identify and use certain ways of teaching and learning, while avoiding others. The five components of the model are presented below:

• Relevance to effective educational theories: the first criterion refers to whether the proposed pedagogy relates to specific educational theories that have been shown to be effective in terms of improving learning.

• Research evidence about the effectiveness of the proposed pedagogies: the second criterion refers to actual studies testing the proposed pedagogy and their outcomes.

• Relation to the development of twenty-first century skills: the third criterion refers to whether the pedagogy can contribute to the development of the twenty-first century skills or the education vision of 2030 (as described in the introduction section).

• Innovative aspects of pedagogy: the fourth criterion details what is innovative or new in relation to the proposed pedagogy.

• Level of adoption in educational practice: the last criterion brings in evidence about the current level of adoption in education, in an effort to identify gaps in our knowledge and propose future directions of research.

A major component of the proposed framework is effectiveness, or the generation of evidence of impact. The definition of what constitutes evidence varies (Ferguson and Clow, 2017; Batty et al., 2019), and this often relates to the quality or strength of the evidence presented. The Strength of Evidence pyramid by John and McNeal (2017) (see Figure 1) categorizes different types of evidence based on their strength, ranging from expert opinions as the weakest type of evidence to meta-analysis or synthesis as the strongest or most reliable form of evidence. While the bottom of the pyramid refers to "practitioners' wisdom about teaching and learning," the next two levels refer to peer-reviewed and published primary sources of evidence, both qualitative and quantitative. These are mostly case studies, based on either the example of a single institution, or a cross-institutional analysis involving multiple courses or institutions. The top two levels involve careful consideration of existing sources of evidence and their inclusion in a synthesis or meta-analysis. For example, variations of this pyramid in medical studies place Randomized Controlled Trials (RCTs) at the second-highest level of the pyramid, indicating the value of this approach for gaining less biased, higher-quality evidence.


Figure 1 . The strength of evidence pyramid ( John and McNeal, 2017 ).

Another approach, proposed by the innovation foundation Nesta, presents evidence on a scale of 1 to 5, indicating the level of confidence in the impact of an intervention (Puttick and Ludlow, 2012). Level 1 studies describe logically, coherently, and convincingly what has been done and why it matters, while level 5 studies produce manuals ensuring consistent replication of a study. The evidence becomes stronger when studies prove causality (e.g., through experimental approaches) and can be replicated successfully. While these frameworks are useful for assessing the quality or strength of evidence, they do not make any reference to how the purpose of a study can define which type of evidence to collect. Different types of evidence could effectively address different purposes; depending on the objective of a given study, a different type of evidence could be used (Batty et al., 2019). For example, the UK government-funded research work by the Education Endowment Foundation (EEF) uses RCTs, instead of, for example, expert opinions, as the purpose of its studies is to capture the impact of certain interventions nationally across schools in the UK.

Education, as opposed to other disciplines such as medicine and agriculture, has been less concerned with evaluating different pedagogical approaches and determining their impact on learning outcomes. The argument often made is that evaluating learning processes is difficult, especially through experimental methodologies, due to variability in teaching conditions across classrooms and between different practitioners, which may inhibit valid comparisons and conclusions. In particular, RCTs have been sparse and often criticized as not explaining any impact (or absence of impact) on learning, a limitation that could be overcome by combining RCT outcomes with qualitative methodologies (Herodotou et al., 2017a). Mixed-methods evaluations could identify how faithfully an intervention is applied in different learning contexts or, for example, the degree to which teachers have engaged with it. An alternative approach is Design-Based Research (DBR); this is a form of action-based research where a problem in the educational process is identified, solutions informed by existing literature are proposed, and iterative cycles of testing and refinement take place in order to identify what works in practice and improve the solution. DBR often results in guidelines or theory development (e.g., Anderson and Shattuck, 2012).

An evidence-based mindset in education has recently been popularized through the EEF. Its Teaching and Learning Toolkit provides an overview of existing evidence about certain approaches to improving teaching and learning, summarized in terms of impact on attainment, cost, and the supporting strength of evidence. Amongst the most effective teaching approaches are the provision of feedback, development of metacognition and self-regulation, homework for secondary students, and mastery learning ( https://educationendowmentfoundation.org.uk ). Similarly, the National Center for Education Evaluation (NCEE) in the US conducts large-scale, government-funded evaluations of education programs. Amongst the interventions with the highest effectiveness ratings are phonological awareness training, reading recovery, and dialogic reading ( https://ies.ed.gov/ncee/ ).

The importance of evidence generation is also evident in the explicit focus of Higher Education institutions in understanding and increasing educational effectiveness as a means to: tackle inequalities and promote educational justice (see Durham University Evidence Center for Education; DECE), provide high quality education for independent and lifelong learners (Learning and Teaching strategy, Imperial College London), develop criticality and deepen learning (London Center for Leadership in Learning, UCL Institute of Education), and improve student retention and performance in online and distance settings [Institute of Educational Technology (IET) OU].

The generation of evidence can help identify or debunk possible myths in education and distinguish between practitioners' beliefs about what works in their practice as opposed to research evidence emerging from systematically assessing a specific teaching approach. A characteristic example is the “Learning Styles” myth and the assumption that teachers should identify and accommodate each learner's special way of learning such as visual, auditory and kinesthetic. While there is no consistent evidence that considering learning styles can improve learning outcomes (e.g., Rohrer and Pashler, 2010 ; Kirschner and van Merriënboer, 2013 ; Newton and Miah, 2017 ), many teachers believe in learning styles and make efforts in organizing their teaching around them ( Newton and Miah, 2017 ). In the same study, one third of participants stated that they would continue to use learning styles in their practice despite being presented with negative evidence. This suggests that we are rather in the early days of transforming the practice of education and in particular, developing a shared evidence-based mindset across researchers and practitioners.

In order to critically review the 70 innovative pedagogies from the seven Innovating Pedagogy reports, over a period of 2 months the seven authors critically evaluated academic and gray literature published after the respective reports were launched. In line with the five criteria defined above, each author recorded in a shared Google sheet what evidence was available for promising approaches. From the initial list of 70, a short-list of 10 approaches was pre-selected. These were further fine-tuned to the final six approaches identified for this study, based upon the emerging evidence of impact available as well as the potential opportunity for future educational innovation. The emerging evidence and impact of the six approaches were peer-reviewed by the authoring team after contributions had been anonymized, and the lead author assigned the final categorizations.

In the next section, we present each of the proposed pedagogies in relation to how it meets the framework criteria, in an effort to understand what we know about its effectiveness, what evidence exists showcasing impact on learning, how each pedagogy accommodates the vision of twenty-first century skills development, its innovative aspects, and its current level of adoption in educational practice.

Selected Pedagogies

Formative Analytics

Relevance to Effective Educational Theories

As indicated by the Innovating Pedagogy 2016 report ( Sharples et al., 2016 , p.32), “formative analytics are focused on supporting the learner to reflect on what is learned, what can be improved, which goals can be achieved, and how to move forward.” In contrast to most analytics approaches that focus on analytics of learning, formative analytics aims to support analytics for learning, for a learner to reach his or her goals through “smart” analytics, such as visualizations of potential learning paths or personalized feedback. For example, these formative analytics might help learners to effectively self-regulate their learning. Zimmerman (2000) defined self-regulation as “self-generated thoughts, feelings and actions that are planned and cyclically adapted to the attainment of personal learning goals.” Students have a range of choices and options when they are learning in blended or online environments as to when, what, how, and with whom to study, with minimal guidance from teachers. Therefore, “appropriate” Self-Regulated Learning (SRL) strategies are needed for achieving individual learning goals ( Hadwin et al., 2011 ; Trevors et al., 2016 ).

With the arrival of fine-grained log-data and the emergence of learning analytics there are potentially more, and perhaps new, opportunities to map how to support students with different SRL ( Winne, 2017 ). With trace data on students' affect (e.g., emotional expression in text, self-reported dispositions), behavior (e.g., engagement, time on task, clicks), and cognition (e.g., how to work through a task, mastery of task, problem-solving techniques), researchers and teachers are able to potentially test and critically examine pedagogical theories like SRL theories on a micro as well as macro-level ( Panadero et al., 2016 ; D'Mello et al., 2017 ).

Research Evidence About the Effectiveness of the Proposed Pedagogies

There is an emerging body of literature that uses formative analytics to support SRL and to understand how students set goals and solve computer-based tasks (Azevedo et al., 2013; Winne, 2017). For example, using the software tool nStudy, Winne (2017) recently showed that trace data from students in the form of notes, bookmarks, or quotes can be used to understand the cycles of self-regulation. In a study of 285 students learning French in a business context, Gelan et al. (2018), using log-file data, found that engaged and self-regulated students outperformed students who were "behind" in their study. In an introductory mathematics course amongst 922 students, Tempelaar et al. (2015) showed that a combination of students' self-reported learning dispositions and log-data of actual engagement in mathematics tasks provides effective formative analytics feedback to students. Recently, Fincham et al. (2018) found that formative analytics could actively encourage 1,138 engineering learners to critically reflect upon the learning strategy they had adopted (one of eight identified) and, where needed, adjust it.

Relation to the Development of Twenty-First Century Skills

Beyond providing markers for formative feedback on cognitive skills (e.g., mastery of mathematics, critical thinking), formative analytics tools have also been used to support twenty-first century affective (e.g., anxiety, self-efficacy) and behavioral (e.g., group work) skills. For example, a group widget developed by Scheffel et al. (2017) made group members more aware of their online peers and their contributions. Similarly, providing automatic computer-based assessment feedback on mastery of mathematics exercises, together with different options to work out the next task, allowed students with math anxiety to develop more self-efficacy over time when they actively engaged with formative analytics (Tempelaar et al., 2018). Although implementing automated formative analytics is relatively easier with structured cognitive tasks (e.g., multiple choice questions, calculations), there is an emerging body of research that focuses on using more complex and unstructured data, such as text and emotion data (Azevedo et al., 2013; Panadero et al., 2016; Trevors et al., 2016), which can effectively provide formative analytics beyond cognition.

Innovative Aspects of Pedagogy

By using fine-grained data and reporting it directly back to students in the form of feedback or dashboards, educational practice is substantially influenced and, ultimately, innovated. In particular, instead of waiting for feedback from a teacher at the end of an assessment task, students can receive formative analytics on demand (when they want to), or ask for the formative analytics that link to their own self-regulation strategies. This is a radical departure from more traditional pedagogies that either place the teacher at the center, or expect students to be fully responsible for their SRL.

Level of Adoption in Educational Practice

Beyond the widespread practice of formative analytics in computer-based assessment (Scherer et al., 2017), there is an emerging field of practice whereby institutions provide analytics dashboards directly to students. For example, in a recent review of learning analytics dashboards, Bodily et al. (2018) conclude that many dashboards use principles and conceptualizations of SRL, which could be used to support teachers and students, assuming they have the capability to use these tools. However, substantial challenges remain as to how to effectively provide these formative analytics to teachers (Herodotou et al., 2019) and students (Scherer et al., 2017; Tempelaar et al., 2018), and how to make sure that students' existing positive SRL strategies are encouraged rather than hampered by overly prescriptive and simplistic formative analytics solutions.

Teachback

The method of Teachback, and the name, were originally devised by the educational technologist Gordon Pask (1976), as a means for two or more people to demonstrate that they are progressing toward a shared understanding of a complex topic. It starts with an expert, teacher, or more knowledgeable student explaining their knowledge of a topic to another person who has less understanding. Next, the less knowledgeable student attempts to teach back what they have learned to the more knowledgeable person. If that is successful, the one with more knowledge might then explain the topic in more detail. If the less knowledgeable person has difficulty in teaching back, the person with more expertise tries to explain in a clearer or different way. The less knowledgeable person teaches it again until they both agree.

A classroom teachback session could consist of pairs of students taking turns to teach back to each other a series of topics set by the teacher. For example, a science class might be learning the topic of “eclipses.” The teacher splits the class into pairs and asks one student in each pair to explain to the other what they know about “eclipse of the sun.” Next, the class receives instruction about eclipses from the teacher, or a video explanation. Then, the second student in the pair teaches back what they have just learned. The first student asks questions to clarify such as, “What do you mean by that?” If either student is unsure, or the two disagree, then they can ask the teacher. The students may also jointly write a short explanation, or draw a diagram of the eclipse, to explain what they have learned.

The method is based on the educational theory of “radical constructivism” (e.g., von Glaserfeld, 1987 ) which sees knowledge as an adaptive process, allowing people to cope in the world of experience by building consensus through mutually understood language. It is a cybernetic theory, not a cognitive one, in which structured conversation and feedback among individuals create a system that “comes to know” by creating areas of mutual understanding.

Some doctors and healthcare professionals have adopted teachback in their conversations with patients to make sure they understand instructions on how to take medication and manage their care. In a study by Negarandeh et al. (2013) with 43 diabetic patients, a nurse conducted one 20-min teachback session for each patient, each week over 3 weeks. A control group ( N = 40) spent similar times with the nurse, but received standard consultations. The nurse asked questions such as “When you get home, your partner will ask you what the nurse said. What will you tell them?” Six weeks after the last session, those patients who learned through teachback knew significantly more about how to care for their diabetes than the control group patients. Indeed, a systematic review study of 12 published articles covering teachback for patients showed positive outcomes on a variety of measures, though not all were statistically significant ( Ha Dinh et al., 2016 ).

Teachback has strong relevance in a world of social and conversational media, with "fake news" competing for attention alongside verified facts and robust knowledge. How can a student "come to know" a new topic, especially one that is controversial? Teachback can be a means to develop the skills of questioning knowledge, seeking understanding, and striving for agreement.

The conversational partner in Teachback could be an online tutor or fellow student, or an Artificial Intelligence (AI) system that provides a "teachable agent." With a teachable agent, the student attempts to teach a recently-learned topic to the computer and can see a dynamic map of the concepts that the computer agent has "learned" ( www.teachableagents.org/ ). The computer could then attempt to teach back the knowledge. Alternatively, AI techniques can enhance human teachback by offering support and resources for a productive conversation, for example to search for information or clarify the meaning of a term.

Rudman (2002) demonstrated a computer-based variation on teachback. In this study, one person learned the topic of herbal remedies from a book and became the teacher. A second person then attempted to learn about the same topic by holding a phone conversation with the more-knowledgeable teacher. The phone conversation between the two people was continually monitored by an AI program that detected keywords in the spoken dialogue. Whenever the AI program recognized a keyword or phrase in the conversation (such as the name of a medicinal herb, or its properties), it displayed useful information on the screen of the learner, but not the teacher. Giving helpful feedback to the learner balanced the conversation, so that both could hold a more constructive discussion.
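
The sketch below gives a much-simplified, text-based flavor of that keyword-triggered support: whenever a known term appears in the conversation, background notes are shown on the learner's screen only. The keywords and notes are invented for illustration and are not taken from Rudman's system, which worked on spoken dialogue.

# A minimal, text-based sketch of keyword-triggered support during teachback:
# when a known term appears in an utterance, background information is shown
# to the learner only. Keywords and notes are illustrative, not from the
# original study.
SUPPORT_NOTES = {
    "chamomile": "Chamomile: commonly described as a calming herbal infusion.",
    "peppermint": "Peppermint: often used for digestive complaints.",
}

def learner_support(utterance):
    """Return notes to display on the learner's screen for any keyword
    detected in the current utterance (speaker-agnostic, case-insensitive)."""
    text = utterance.lower()
    return [note for term, note in SUPPORT_NOTES.items() if term in text]

# Simulated fragment of a teachback conversation
dialogue = [
    ("teacher", "Chamomile is usually taken as a tea before sleep."),
    ("learner", "So chamomile helps with sleep, and peppermint is for digestion?"),
]
for speaker, utterance in dialogue:
    for note in learner_support(utterance):
        print(f"[learner screen] {note}")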

The method has seen some adoption into medical practice ( https://bit.ly/2Xr9qY5 ). It has also been tested at small scale for science education ( Gutierrez, 2003 ). Reciprocal teaching has been adopted in some schools for teaching of reading comprehension ( Oczuks, 2003 ).

Place-Based Learning

Place-based learning derives learning opportunities from local community settings. These help students to connect abstract concepts from the classroom and textbooks with practical challenges encountered in their own localities. "Place" can refer to learning about physical localities, but also the social and cultural layers embedded within neighborhoods; and engaging with communities and environments as well as observing them. It can be applied as much to arts and humanities focused learning as to science-based learning. Place-based learning can encompass service learning, where students and teachers solve local community problems and, through place-based learning, acquire and learn a range of skills (Sobel, 2004). Mobile and networked technologies have opened up new possibilities for constructing and sharing knowledge, and reaching out to different stakeholders. Learning can take place while mobile, enabling communication across students and teachers, and beyond the field site. The physical and social aspects of the environment can be enhanced or augmented by digital layers to enable a richer experience, and greater access to resources and expertise.

Place-based learning draws upon experiential models of learning (e.g., Kolb, 1984), where active engagement with a situation and the resulting experiences are reflected upon to help conceptualize learning, which in turn may trigger further explorations or experimentation. It may be structured as problem-based learning. Unplanned or unintentional learning outcomes may occur as a result of these engagements, so place-based learning also draws on incidental learning (e.g., Kerka, 2000). Place-based learning holds that a more "authentic" and meaningful learning experience can happen in relevant environments, aligning with situated cognition, which states that knowledge is situated within physical, social, and cultural contexts (Brown et al., 1989). Learning episodes are often encountered with and through other people, a form of socio-cultural learning (e.g., Vygotsky, 1978). Networked technologies can enhance the experiences that may be possible through the connections that might be made, recently articulated as connectivism (e.g., Siemens, 2005; Ito et al., 2013).

Place-based learning draws on a range of pedagogies, and in part derives its authority from research into their efficacy (e.g., experiential learning, situated learning, problem-based learning). For example, in a study of 400 US high school students, Ernst and Monroe (2004) found that environment-based teaching significantly improved both students' critical thinking skills and their disposition toward critical thinking. Research has shown that learning is very effective if carried out in "contexts familiar to students' everyday lives" (Bretz, 2001, p.1112). In another study, Linnemanstons and Jordan (2017) found that educators perceived students to display greater engagement and understanding of concepts when learning through experiential approaches in a specific place. Semken and Freeman (2008) trialed a method to test whether "sense of place" could be measured as a learning outcome when students are taught through place-based science activities. Using a set of psychometric surveys tested on a cohort of 31 students, they "observed significant gains in student place attachment and place meaning" (p.1042). In an analysis of 23 studies exploring indigenous education in Canada, Madden (2015) showed that place-based education can play an effective role in decolonizing the curriculum, fostering understandings of shared histories between indigenous and non-indigenous learners in Canada. Context-aware systems that are triggered by place can provide location-relevant learning resources (Kukulska-Hulme et al., 2015), enhancing the ecology of tools available for place-based learning. However, prompts to action from digital devices might also be seen as culturally inappropriate in informal, community-based learning, where educational activities and their deployment need to be considered with sensitivity (Gaved and Peasgood, 2017).
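
To make the idea of location-triggered resources concrete, the sketch below shows one simple way a mobile app might decide which learning materials to surface, using a great-circle (haversine) distance check against nearby points of interest. The place names, coordinates, resources, and trigger radius are all hypothetical; this is not a description of the systems cited above.

from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical points of interest with attached learning resources
POINTS_OF_INTEREST = [
    {"name": "Canal lock", "lat": 52.0406, "lon": -0.7594,
     "resource": "Short audio guide on nineteenth-century canal engineering"},
    {"name": "War memorial", "lat": 52.0450, "lon": -0.7610,
     "resource": "Local oral-history recordings"},
]

def nearby_resources(lat, lon, radius_km=0.2):
    """Return resources whose point of interest lies within the trigger radius."""
    return [p["resource"] for p in POINTS_OF_INTEREST
            if haversine_km(lat, lon, p["lat"], p["lon"]) <= radius_km]

print(nearby_resources(52.0408, -0.7596))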

Critical thinking and problem solving are central to this experiential approach to learning. Because it is contextually based, place-based learning requires creativity and innovation from participants to manage and respond to often unexpected circumstances, with unexpected learning opportunities and outcomes likely to arise. As an often social form of learning, communication and collaboration are key skills developed, with a need to show sensitivity to local circumstances. The ability to manage social and cross-cultural interaction will be central for a range of subject areas taught through place-based learning, such as language learning or human geography. Increasingly, place-based learning is enhanced or augmented by mobile and networked technologies, so digital literacy skills need to be acquired to take full advantage of the tools now available.

Place-based learning re-associates learning with local contexts, at a time when educators are under pressure to fit into national curricula and a globalized world. It seeks to re-establish in students a sense of place, and to recognize the opportunities of learning in and from local community settings, using neighborhoods as the specific context for experiential and problem-based learning. It can provide a mechanism for decolonizing the curriculum, recognizing that specific spaces can be understood to have different meanings by different groups of people, and allowing diverse voices to be represented. Digital and networked technologies extend the potential for group and individual learning, reaching out and sharing knowledge with a wider range of stakeholders, enabling flexibility in learning, and a greater scale of interactions. Networked tools enable access to global resources, and learning beyond the internet, with smartphones and tablets (increasingly owned by the learners themselves) as well as other digital tools linked together for gathering, analyzing, and reflecting on data and interactions. Context- and location-aware technologies can trigger learning resources on personal devices, and augment physical spaces: augmented reality tools can dynamically overlay data layers and context-sensitive virtual information (Klopfer and Squire, 2008; Wu et al., 2013).

Place-based learning could be said to pre-date formal classroom-based learning, in the traditional senses of work-based learning (e.g., apprenticeships) or informal learning (e.g., informal language learning). Aspects of place-based learning have a long heritage, such as environmental education and learning through overcoming neighborhood challenges, with the focus on taking account of learning opportunities "beyond the schoolhouse gate" (Theobald and Curtiss, 2000). Place-based learning aligns with current pedagogical interests in education that is "multidisciplinary, experiential, and aligned with cultural and ecological sustainability" (Webber and Miller, 2016, p.1067).

Learning With Robots

Learning through interaction, and then reflecting upon the outcomes of these interactions, prompted Papert (1980) to develop the Logo Turtles. It can be argued that these turtles were among the first robots to be used in schools, and their theoretical premises were grounded within a constructivist approach to learning. Constructivism translates into a pedagogy where students actively engage in experimental endeavors, often based within real-world problem-solving undertakings. This was how the first turtles were used to assist children to understand basic mathematical concepts. Logo turtles have morphed into wheeled robots in current Japanese classrooms, where 11- and 12-year-olds learn how to program them and then compete in teams to create the code needed to guide their robots safely through an obstacle course. This latter approach encourages children to "Think and Learn Together with Information and Communication Technology," as discussed by Dawes and Wegerif (2004). Vygotsky's theoretical influence is then foregrounded in this particular pedagogical context, where his sociocultural theory recognizes and emphasizes the role of language within any social interaction to prompt cognitive development.
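
For readers unfamiliar with Logo-style programming, the short sketch below uses Python's standard turtle module as a rough analogue of the kind of task described above: children discover the relationship between turn angles and shapes by instructing a programmable "turtle." It is purely illustrative and does not reproduce the classroom materials from any of the cited studies.

# A rough analogue of a Logo-style exercise using Python's standard turtle
# module: learners explore how the turn angle determines the shape by
# programming the "robot" to trace regular polygons.
import turtle

def regular_polygon(t, sides, length):
    """Draw a regular polygon; the exterior angle is 360 / sides."""
    for _ in range(sides):
        t.forward(length)
        t.left(360 / sides)

if __name__ == "__main__":
    t = turtle.Turtle()
    for sides in (3, 4, 6):   # triangle, square, hexagon
        regular_polygon(t, sides, 80)
    turtle.done()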

The early work of Papert has been well documented, but more recently Benitti (2012) reviewed the literature about the use of robotics in schools. The conclusions reached from this review, which took into account the purpose of each study, the type of robot used, and the demographics of the children who took part, suggested that the use of robots in classrooms can enhance learning, particularly in the practical teaching of STEM subjects, although some studies did not reveal improvements in learning. Further work by Ospennikova et al. (2015) showed how this technology can be applied to teaching physics in Russian secondary schools and supports the use of learning with robots in STEM subjects. Social robots for early language learning have been explored by Kanero et al. (2018); they have proved to be positive for storytelling skills (Westlund and Breazeal, 2015). Kim et al. (2013) have illustrated that social robots can assist young children with ASD to produce more speech utterances. However, none of the above studies illustrate that robots are more effective than human teachers, and this pedagogy is ripe for further research.

Teaching a robot to undertake a task through specific instructions mimics the way human teachers behave with pupils when they impart a rule set or heuristics, using a variety of rhetorical techniques in reaction to the learner's latest attempt at completing a given task. This modus operandi has been well documented by Jerome Bruner and colleagues and has been termed "scaffolding" (Wood et al., 1979). This example illustrates a growing recognition of the expanding communicative and expressive potential found through working with robots and encouraging teamwork and collaboration.

The robot can undertake a number of roles, with different levels of involvement in the learning task. Some of the examples mentioned above demonstrate the robot taking on a more passive role (Mubin et al., 2013), as when it is used to teach programming, such as moving the robot along a physical route with many obstacles. Robots can also act as peers and learn together with the student, or act as a teacher themselves. The "interactive cat" (iCat) developed by Philips Research is an example of a robotic teacher helping language learning. It has a mechanically rendered cat face and can express emotion. This was an important feature with respect to social supportiveness, an important attribute of human tutors. Research showed that socially supportive behavior exhibited by the robot tutor had a positive effect on students' learning. The supportive behaviors exhibited by the iCat tutor were non-verbal, such as smiling, attention building, empathy, and communicativeness.

Interest in learning with robots in the classroom and beyond is growing, but the need to purchase expensive equipment that requires technical support can prevent adoption. There are also ethical issues that need to be addressed, since "conversations" with embodied robots that can support both learning and new forms of assessment must sustain equity within an ethical framework. As yet, such frameworks have not been agreed within the AI community.

Learning With Drones

Outdoor fieldwork is a long-standing student-centered pedagogy across a range of disciplines, which is increasingly supported by information technology ( Thomas and Munge, 2015 , 2017 ). Within this tradition, drone-based learning, a recent innovation, is being used to support fieldwork by enhancing students' capability to explore outdoor physical environments. When students engage in outdoor learning experiences, reflect on those experiences, conceptualize their learning and experiment with new actions, they are engaging in experiential learning ( Kolb, 1984 ). The combination of human senses with the multimedia capabilities of a drone (image and video capture) means that the learning experience can be rich and multimodal. Another key aspect is that learning takes place through research, scientific data collection and analysis; drones are typically used to assist with data collection from different perspectives and in places that can be difficult to access. In the sphere of informal and leisure learning in places such as nature reserves and cultural heritage sites ( Staiff, 2016 ), drone-based exploration is based on discovery and is a way to make the visitor experience more attractive.
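
As an illustration of the data-collection role described above (and not of any cited study), the sketch below generates a simple "lawnmower" pattern of waypoints for systematically photographing a rectangular field plot. The coordinates, grid size, and spacing are hypothetical; real surveys would be planned with dedicated flight software, appropriate altitude and image overlap, and within local regulations.

def survey_waypoints(lat_min, lat_max, lon_min, lon_max, rows=4, cols=5):
    """Generate a lawnmower (boustrophedon) pattern of waypoints for
    systematic image capture over a rectangular plot. Coordinates are
    treated as a plain grid for simplicity."""
    waypoints = []
    for r in range(rows):
        lat = lat_min + (lat_max - lat_min) * r / (rows - 1)
        lons = [lon_min + (lon_max - lon_min) * c / (cols - 1) for c in range(cols)]
        if r % 2 == 1:        # reverse every other row to fly back and forth
            lons.reverse()
        waypoints.extend((lat, lon) for lon in lons)
    return waypoints

# Hypothetical plot boundaries for a field-study site
for lat, lon in survey_waypoints(52.0400, 52.0410, -0.7600, -0.7580):
    print(f"{lat:.5f}, {lon:.5f}")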

There is not yet much research evidence on drone-based learning, but there are some case studies, teachers' accounts based on observations of their students, and pedagogically informed suggestions for how drones may be applied to educational problems and the development of students' knowledge and practical skills. For example, a case study conducted in Malaysia with postgraduate students taking a MOOC (Zakaria et al., 2018) was concerned with students working on a video creation task using drones, in the context of problem-based learning about local issues. The data analysis showed how active the students had been during a task which involved video shooting and editing/production. In the US, it was reported that a teacher introduced drones to a class of elementary students with autism in order to enhance their engagement; according to the teacher, the results were "encouraging" since the students stayed on task better and were more involved with learning (Joch, 2018). In the context of education in Australia, Sattar et al. (2017) give suggestions for using drones to develop many kinds of skills, competences, and understanding in various disciplines, also emphasizing the learners' active engagement.

Sattar et al. (2017) argue that using drone technology will prepare and equip students with the technical skills and expertise which will be in demand in future, enhance their problem-solving skills and help them cope with future technical and professional requirements; students can be challenged to develop skills in problem-solving, analysis, creativity and critical thinking. Other ideas put forward in the literature suggest that drone-based learning can stimulate curiosity to see things that are hidden from view, give experience in learning through research and analyzing data, and it can help with visual literacies including collecting visual data and interpreting visual clues. Another observation is that drone-based learning can raise issues of privacy and ethics, stimulating discussion of how such technologies should be used responsibly when learning outside the classroom.

Drones enable learners to undertake previously impossible actions on field trips, such as looking inside inaccessible places or inspecting a landscape from several different perspectives. There is opportunity for rich exploration of physical objects and spaces. Drone-based learning can be a way to integrate skills and literacies, particularly orientation and motor skills with digital literacy. It is also a new way to integrate studies with real world experiences, showing students how professionals including land surveyors, news reporters, police officers and many others use drones in their work. Furthermore, it has been proposed as an assistive technology, enabling learners who are not mobile to gain remote access to sites they would not be able to visit ( Mangina et al., 2016 ).

Accounts of adoption into educational practice suggest that early adopters with an interest in technology have been the first to experiment with drones. There are more accounts of adoption in community settings, professional practice settings and informal learning than in formal education at present. For example, Hodgson et al. (2018) describe how ecologists use drones to monitor wildlife populations and changes in vegetation. Drones can be used to capture images of an area from different angles, enabling communities to collect evidence of environmental problems such as pollution and deforestation. They are used after earthquakes and hurricanes, to assess the damage caused by these disasters, to locate victims, to help deliver aid, and to enhance understanding of assistance needs ( Sandvik and Lohne, 2014 ). They also enable remote monitoring of illegal trade without having to confront criminals.

Citizen Inquiry

Citizen science, the active participation of members of the public in scientific research, is an increasingly popular activity that has the potential to support growth and development in learning science. It can educate the public, including young people, support the development of skills needed for the workplace, and contribute to the findings of real science research. An experience that allows people to become familiar with the work of scientists, and to learn to do their own science, therefore has considerable potential for learning. Citizen science activities can take place online, on platforms such as Zooniverse, which hosts some of the largest internet-based citizen science projects, or nQuire ( nQuire.org.uk ), which scaffolds a wide range of inquiries; or they can take place offline in a local area (e.g., a bioblitz). In addition, mobile and networked technologies have opened up new possibilities for these investigations (see e.g., Curtis, 2018).

Most current citizen science initiatives engage the general public in some way. For example, members of the public may act as volunteers, often non-expert individuals, in projects generated by scientists, such as species recognition and counting. In these types of collaboration the public contributes to data collection and analysis tasks such as observation and measurement. The key theory which underpins this work is that of inquiry learning. "Inquiry-based learning is a powerful generalized method for coming to understand the natural and social world through a process of guided investigation" (Sharples et al., 2013, p.38). It has been described as a powerful way to foster learning by encouraging learners to use higher-order thinking skills during the conduct of inquiries and to make connections with their world knowledge.
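
As a concrete, entirely hypothetical illustration of the volunteer data-collection role described above, the sketch below aggregates independent species identifications into a simple majority consensus and flags low-agreement observations for expert review. It is loosely inspired by how community platforms weigh agreement and is not the algorithm used by iSpot, Zooniverse, or nQuire.

from collections import Counter

# Hypothetical volunteer identifications for photographed specimens.
# Each record: (observation_id, volunteer_id, proposed_species)
identifications = [
    ("obs1", "v1", "Bombus terrestris"),
    ("obs1", "v2", "Bombus terrestris"),
    ("obs1", "v3", "Bombus lucorum"),
    ("obs2", "v1", "Apis mellifera"),
]

def consensus(identifications, min_agreement=0.6):
    """Return the majority identification per observation, or flag it for
    expert review when volunteer agreement falls below the threshold."""
    by_obs = {}
    for obs, _, species in identifications:
        by_obs.setdefault(obs, []).append(species)
    results = {}
    for obs, species_list in by_obs.items():
        species, votes = Counter(species_list).most_common(1)[0]
        share = votes / len(species_list)
        results[obs] = species if share >= min_agreement else "needs expert review"
    return results

print(consensus(identifications))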

Inquiry learning is a pedagogy with a long pedigree. First proposed by Dewey as learning through experience, it came to the fore in the discovery learning movement of the 1960s. Indeed, the term citizen inquiry has been coined, which "fuses the creative knowledge building of inquiry learning with the mass collaborative participation exemplified by citizen science, changing the consumer relationship that most people have with research to one of active engagement" (Sharples et al., 2013, p.38).

Researchers using this citizen inquiry paradigm have described how it “ shifts the emphasis of scientific inquiry from scientists to the general public, by having non-professionals (of any age and level of experience) determine their own research agenda and devise their own science investigations underpinned by a model of scientific inquiry. It makes extensive use of web 2.0 and mobile technologies to facilitate massive participation of the public of any age, including youngsters, in collective, online inquiry-based activities” ( Herodotou et al., 2017b ). This shift offers more opportunities for learning in these settings.

Research has shown that learning can be developed in citizen science projects. Herodotou et al. (2018), citing a review by Bonney et al. (2009), found that systematic involvement in citizen science projects produces learning outcomes in a number of ways, including increasing the accuracy and degree of self-correction of observations. A number of studies have examined the learning which takes place during the use of iSpot (see Scanlon et al., 2014; Silvertown et al., 2015). Preliminary results showed that novice users can reach a fairly sophisticated understanding of identification over time (Scanlon et al., 2014). Also, Aristeidou et al. (2017, p. 252) examined citizen science activities on nQuire and reported that some participants perceived learning as a reason for feeling satisfied with their engagement, with comments such as "insight into some topics" and "new information."

Through an online survey, Edwards et al. (2017) reported that citizen science participants in the UK Wetland Bird Survey and the Nest Record Scheme had learned along various dimensions. This was found to be related in part to their prior levels of education. Overall, there is a growing number of studies investigating the relationship between citizen science and learning, with some positive indications that projects can be designed to encourage learning (further studies on learning from citizen science are discussed by Ballard et al., 2017, and Boakes et al., 2016).

Citizen science projects can develop the skills required by citizens in the twenty-first century. Citizens "need the skills and knowledge to solve problems, evaluate evidence, and make sense of complex information from various sources" (Ferguson et al., 2017, p.12). As noted by OECD (2015), a significant skill students need to develop is learning to "think like a scientist." This is perceived as an essential skill across professions, not only the science-related ones. In particular, STEM education and jobs are no longer viewed as options for the few or for the "gifted." "Engagement with STEM can develop critical thinking, teamwork skills, and civic engagement. It can also help people cope with the demands of daily life. Enabling learners to experience how science is made can enhance their content knowledge in science, develop scientific skills and contribute to their personal growth. It can also increase their understanding of what it means to be a scientist" (Ferguson et al., 2017, p.12).

One of the innovations of this approach is that it enables potentially any citizen to engage with and understand scientific activities that are often locked behind the walls of experimental laboratories. Thinking scientifically should not be restricted to scientists; it should be a competency that citizens develop in order to engage critically with and reflect on their surroundings. Such skills can enable critical understanding of public debates, including those around fake news, and more active citizenship. Technologically, the development of these skills can be supported by platforms such as nQuire, the vision of which is to scaffold the process of scientific research and facilitate the development of relevant skills amongst citizens.

Citizen science activities are mainly found in informal learning settings, with rather limited adoption in formal education. "For example, the Natural History Museum in London offers citizen science projects that anyone can join as an enjoyable way to interact with nature. Earthworm Watch is one such project that runs every spring and autumn in the UK. It is an outdoor activity that asks people to measure soil properties and record earthworms in their garden or in a local green space. Access to museums such as the Natural History Museum is free of charge allowing all people, no matter what their background, to interact with such activities and meet others with similar interests" (Ferguson et al., 2017, p.13). At the moment, adoption depends on individual educators rather than on policy. Two Open University examples are the incorporation of the iSpot platform into a range of courses, from short courses such as Neighborhood Nature to MOOCs such as An Introduction to Ecology on the FutureLearn platform. In recent years there have been more accounts of citizen science projects within school settings (see e.g., Doyle et al., 2018; Saunders et al., 2018; Schuttler et al., 2018).

Discussion

In this paper, we discussed six innovative approaches to teaching and learning that originated from seven Innovating Pedagogy reports (Sharples et al., 2012, 2013, 2014, 2015, 2016; Ferguson et al., 2017, 2019), drafted between 2012 and 2019 by leading academics in Educational Technology at the OU and institutions in the US, Singapore, Israel, and Norway. Based upon an extensive peer-review by seven OU authors, evidence and impact of six promising innovative approaches were gathered, namely formative analytics, teachback, place-based learning, learning with robots, learning with drones, and citizen inquiry. For these six approaches there is strong or emerging evidence that they can effectively contribute to the development of skills and competences such as critical thinking, problem-solving, digital literacy, thinking like a scientist, group work, and affective development.

The maturity of each pedagogy in terms of evidence generation varies, with some pedagogies, such as learning with drones, being less mature, and others, such as formative analytics, being more advanced. In Table 1, we used the evidence classifications in Figures 1 and 2 to provide our own assessment of the overall quality of evidence for each pedagogy: the strength of evidence (Figure 1) and the level of confidence on a scale of 1–5 based on Nesta's standards of evidence (Figure 2). This serves as a means to identify gaps in current knowledge and direct future research efforts.


Table 1 . Future directions of selected pedagogies.


Figure 2 . Standards of evidence by Nesta.

The proposed pedagogies have great potential in terms of reducing the distance between aspirations or vision for the future of education and current educational practice. This is evident in their relevance to effective educational theories including experiential learning, inquiry learning, discovery learning, and self-regulated learning, all of which are interactive and engaging ways of learning. Also, the review of existing evidence showcases their potential to support learning processes and desirable learning outcomes in both the cognitive and emotional domain. Yet, this list of pedagogies is not exhaustive; additional pedagogies that could potentially meet the selection criteria—and which can be found in the Innovating Pedagogy report series—are for example, playful learning emphasizing the need for play, exploration and learning through failure, virtual studios stressing learning flexibility through arts and design, and dynamic assessment during which assessors support learners in identifying and overcoming learning difficulties.

Conclusions

In this paper we presented six approaches to teaching and learning and stressed the importance of evidence in transforming educational practice. We devised and applied an integrated framework for selection that could be used by both researchers and educators (teachers, pre-service teachers, educational policy makers, etc.) as an assessment tool for reflecting on and assessing specific pedagogical approaches, either currently in practice or intended to be used in education in the future. Our framework goes beyond existing frameworks that focus primarily on the development of skills and competences for the future, by situating such development within the context of effective educational theories, evidence from research studies, innovative aspects of the pedagogy, and its adoption in educational practice. We made the case that learning is a science and that the testing of learning interventions and teaching approaches before applying them to practice should be a requirement for improving learning outcomes and meeting the expectations of an ever-changing society. We hope this work will spark further dialogue between researchers and practitioners and signal the necessity for evidence-based professional development that will inform and enhance teaching practice.

Author Contributions

CH: introduction, discussion, conclusions sections, revision of manuscript. MS: teachback. MG: place-based learning. BR: formative analytics. ES: citizen inquiry. AK-H: learning with drones. DW: learning with robots.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Anderson, T., and Shattuck, J. (2012). Design-based research: a decade of progress in education research? Educ. Res. 41, 16–25. doi: 10.3102/0013189X11428813


Aristeidou, M., Scanlon, E., and Sharples, M. (2017). “Design processes of a citizen inquiry community,” in Citizen Inquiry: Synthesising Science and Inquiry Learning , eds C. Herodotou, M. Sharples, and E. Scanlon (Abingdon: Routledge), 210–229. doi: 10.4324/9781315458618-12

Azevedo, R., Harley, J., Trevors, G., Duffy, M., Feyzi-Behnagh, R., Bouchet, F., et al. (2013). “Using trace data to examine the complex roles of cognitive, metacognitive, and emotional self-regulatory processes during learning with multi-agent systems,” in International Handbook of Metacognition and Learning Technologies , eds R. Azevedo and V. Aleven (New York, NY: Springer New York), 427–449. doi: 10.1007/978-1-4419-5546-3_28

Ballard, H. L., Dixon, C. G. H., and Harris, E. M. (2017). Youth-focused citizen science: examining the role of environmental science learning and agency for conservation. Biol. Conserv. 208, 65–75. doi: 10.1016/j.biocon.2016.05.024

Batty, R., Wong, A., Florescu, A., and Sharples, M. (2019). Driving EdTech Futures: Testbed Models for Better Evidence . London: Nesta.


Benitti, F. B. V. (2012). Exploring the educational potential of robotics in schools: a systematic review. Comput. Educ. 58, 978–988. doi: 10.1016/j.compedu.2011.10.006

Boakes, E. H., Gliozzo, G., Seymour, V., Harvey, M., Smith, C., Roy, D. B., et al. (2016). Patterns of contribution to citizen science biodiversity projects increase understanding of volunteers' recording behaviour. Sci. Rep. 6:33051. doi: 10.1038/srep33051


Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., et al. (2018). “Open learner models and learning analytics dashboards: a systematic review,” in Proceedings of the 8th International Conference on Learning Analytics and Knowledge (Sydney, NSW: ACM), 41–50.

Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., et al. (2009). Citizen science: a developing tool for expanding science knowledge and scientific literacy. Bioscience 59, 977–984. doi: 10.1525/bio.2009.59.11.9

Bretz, S. L. (2001). Novak's theory of education: human constructivism and meaningful learning. J. Chem. Educ. 78:1107. doi: 10.1021/ed078p1107.6

Brown, J. S., Collins, A., and Duguid, P. (1989). Situated cognition and the culture of learning. Educ. Res. 18, 32–42. doi: 10.3102/0013189X018001032

Council of the European Union (2018). Council Recommendations of 22 May 2018 on Key Competences for Lifelong Learning . Brussel: Council of the European Union.

Curtis, V. (2018). “Online citizen science and the widening of academia: distributed engagement with research and knowledge production,” in Palgrave Studies in Alternative Education (Cham: Palgrave Macmillan). doi: 10.1007/978-3-319-77664-4

Dawes, L., and Wegerif, R. (2004). Thinking and Learning With ICT: Raising Achievement in Primary Classrooms . London: Routledge. doi: 10.4324/9780203506448

D'Mello, S., Dieterle, E., and Duckworth, A. (2017). Advanced, analytic, automated (AAA) measurement of engagement during learning. Educ. Psychol. 52, 104–123. doi: 10.1080/00461520.2017.1281747

Doyle, C., Li, Y., Luczak-Roesch, M., Anderson, D., Glasson, B., Boucher, M., et al. (2018). What is Online Citizen Science Anyway? An Educational Perspective. arXiv [Preprint]. arXiv:1805.00441.

Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., and Jardeleza, S. E. (2011). What we say is not what we do: effective evaluation of faculty professional development programs. BioScience 61, 550–558. doi: 10.1525/bio.2011.61.7.9

Edwards, R., McDonnell, D., Simpson, I., and Wilson, A. (2017). “Educational backgrounds, project design and inquiry learning in citizen science,” in Citizen Inquiry: Synthesising Science and Inquiry Learning , eds C. Herodotou, M. Sharples, and E. Scanlon (Abingdon: Routledge), 195–209. doi: 10.4324/9781315458618-11

Ernst, J., and Monroe, M. (2004). The effects of environment-based education on students' critical thinking skills and disposition toward critical thinking. Environ. Educ. Res. 10, 507–522. doi: 10.1080/1350462042000291038

Ferguson, R., Barzilai, S., Ben-Zvi, D., Chinn, C. A., Herodotou, C., Hod, Y., et al. (2017). Innovating Pedagogy 2017: Open University Innovation Report 6 . Milton Keynes: The Open University.

Ferguson, R., and Clow, D. (2017). “Where is the evidence? A call to action for learning analytics,” in Proceedings of the 6th Learning Analytics Knowledge Conference (Vancouver, BC: ACM), 56–65.

Ferguson, R., Coughlan, T., Egelandsdal, K., Gaved, M., Herodotou, C., Hillaire, G., et al. (2019). Innovating Pedagogy 2019: Open University Innovation Report 7 . Milton Keynes: The Open University.

Fincham, O. E., Gasevic, D., Jovanovic, J. M., and Pardo, A. (2018). From study tactics to learning strategies: an analytical method for extracting interpretable representations. IEEE Trans. Lear. Technol. 12, 59–72. doi: 10.1109/TLT.2018.2823317

Gaved, M., and Peasgood, A. (2017). Fitting in versus learning: a challenge for migrants learning languages using smartphones. J. Interact. Media Educ. 2017:1. doi: 10.5334/jime.436

Gelan, A., Fastré, G., Verjans, M., Martin, N., Janssenswillen, G., Creemers, M., et al. (2018). Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project. Comp. Assist. Lang. Learn. 31, 294–319. doi: 10.1080/09588221.2017.1418382

Gutierrez, R. (2003). “Conversation theory and self-learning,” in Science Education Research in the Knowledge-Based Society , eds D. Psillos, P. Kariotoglou, V. Tselfes, E. Hatzikraniotis, G. Fassoulopoulos, and M. Kallery (Dordrecht: Springer Netherlands), 43–49.

Ha Dinh, T. T., Bonner, A., Clark, R., Ramsbotham, J., and Hines, S. (2016). The effectiveness of the teach-back method on adherence and self-management in health education for people with chronic disease: a systematic review. JBI Database Syst. Rev. Implement. Rep. 14, 210–247. doi: 10.11124/jbisrir-2016-2296

Hadwin, A., Järvelä, S., and Miller, M. (2011). “Self-regulated, co-regulated, and socially shared regulation of learning,” in Handbook of Self-Regulation of Learning and Performance , eds B. Zimmerman and D. Schunk (New York, NY: Routledge), 65–84.

Herodotou, C., Aristeidou, M., Sharples, M., and Scanlon, E. (2018). Designing citizen science tools for learning: lessons learnt from the iterative development of nQuire. Res Pract. Technol. Enhanced Lear. 13:4. doi: 10.1186/s41039-018-0072-1

Herodotou, C., Heiser, S., and Rienties, B. (2017a). Implementing randomised control trials in open and distance learning: a feasibility study. Open Learn. 32, 147–162. doi: 10.1080/02680513.2017.1316188

Herodotou, C., Rienties, B., Verdin, B., and Boroowa, A. (2019). Predictive learning analytics ‘at scale’: guidelines to successful implementation in higher education. J. Learn. Anal. 6, 85–95. doi: 10.18608/jla.2019.61.5

Herodotou, C., Sharples, M., and Scanlon, E. (2017b). “Introducing citizen inquiry,” in Citizen Inquiry: Synthesising Science and Inquiry Learning , eds C. Herodotou, M. Sharples, E. Scanlon (Routledge).

Hodgson, J., Terauds, A., and Pin Koh, L. (2018). ‘ Epic Duck Challenge’ Shows Drones Can Outdo People at Surveying Wildlife [Online]. The Conversation . Available online at: https://theconversation.com/epic-duck-challenge-shows-drones-can-outdo-people-at-surveying-wildlife-90018 (accessed May 23, 2019).

Ito, M., Gutiérrez, K., Livingstone, S., Penuel, B., Rhodes, J., Salen, K., et al. (2013). Connected Learning: An Agenda for Research and Design . Irvine, CA: Digital Media and Learning Research Hub.

Joch, A. (2018, March 27). With drones, students tackle complex topics. EdTech Magazine , Online article.

John, K. S., and McNeal, K. (2017). The Strength of Evidence Pyramid [Online]. National Association of Geoscience Teachers . Available online at: https://nagt.org/nagt/profdev/workshops/geoed_research/pyramid.html (accessed May 23, 2019).

Kanero, J., Geçkin, V., Oranç, C., Mamus, E., Küntay, A. C., and Göksun, T. (2018). Social robots for early language learning: current evidence and future directions. Child Dev. Perspect. 12, 146–151. doi: 10.1111/cdep.12277

Kerka, S. (2000). “Incidental learning,” in Trends and Issues Alert (Columbus, OH: Center on Education and Training for Employment, Ohio State University).

Kim, E. S., Berkovits, L. D., Bernier, E. P., Leyzberg, D., Shic, F., Paul, R., et al. (2013). Social robots as embedded reinforcers of social behavior in children with autism. J. Autism Dev. Disord. 43, 1038–1049. doi: 10.1007/s10803-012-1645-2

Kirschner, P. A., and van Merriënboer, J. J. G. (2013). Do learners really know best? urban legends in education. Educ. Psychol. 48, 169–183. doi: 10.1080/00461520.2013.804395

Klopfer, E., and Squire, K. (2008). Environmental detectives—the development of an augmented reality platform for environmental simulations. Educ. Technol. Res. Dev. 56, 203–228. doi: 10.1007/s11423-007-9037-6

Kolb, D. (1984). Experiential Learning: Experience as the Source of Learning and Development . Englewood Cliffs, NJ: Prentice Hall.

Kukulska-Hulme, A., Gaved, M., Paletta, L., Scanlon, E., Jones, A., and Brasher, A. (2015). Mobile incidental learning to support the inclusion of recent immigrants. Ubiquitous Learn. 7, 9–21. doi: 10.18848/1835-9795/CGP/v07i02/58070

Linnemanstons, K. A., and Jordan, C. M. (2017). Learning through place: evaluation of a professional development program for understanding the impact of place-based education and teacher continuing education needs. J. Sustain. Educ. 12, 1–25. Retrieved from: http://www.susted.com/wordpress/content/learning-through-place-evaluation-of-a-professional-development-program-for-understanding-the-impact-of-place-based-education-and-teacher-continuing-education-needs_2017_02/

Madden, B. (2015). Pedagogical pathways for Indigenous education with/in teacher education. Teach. Teach. Educ. 51, 1–15. doi: 10.1016/j.tate.2015.05.005

Mangina, E., O' Keeffe, E., Eyerman, J., and Goodman, L. (2016). “Drones for live streaming of visuals for people with limited mobility,” in 2016 22nd International Conference on Virtual System & Multimedia (VSMM) , 1–6. doi: 10.1109/VSMM.2016.7863162

Mitchell, R. (2018). Experts Warn Play Time is ‘Disappearing’ as Emphasis is Placed on Performance and Tests [Online]. The West Australian . Available online at: http://bit.ly/2FTIVGh (accessed June 27, 2018).

Mubin, O., Stevens, C. J., Shahid, S., Al Mahmud, A., and Dong, J.-J. (2013). A review of the applicability of robots in education. J. Technol. Educ. Learn. 1, 209–216. doi: 10.2316/Journal.209.2013.1.209-0015

Negarandeh, R., Mahmoodi, H., Noktehdan, H., Heshmat, R., and Shakibazadeh, E. (2013). Teach back and pictorial image educational strategies on knowledge about diabetes and medication/dietary adherence among low health literate patients with type 2 diabetes. Prim. Care Diabetes 7, 111–118. doi: 10.1016/j.pcd.2012.11.001

Newton, P. M., and Miah, M. (2017). Evidence-based higher education – Is the learning styles ‘myth’ important? Front. Psychol. 8:444. doi: 10.3389/fpsyg.2017.00444

Oczkus, L. (2003). Reciprocal Teaching at Work: Strategies for Improving Reading Comprehension . Newark, DE: International Reading Association.

OECD (2015). Students, Computers and Learning: Making the Connection, PISA . Paris: OECD Publishing. doi: 10.1787/9789264239555-en

OECD (2016a). United Kingdom Country Note. Programme for International Student Assessment (PISA) – Results from PISA 2015 . Paris: OECD Publishing.

OECD (2016b) PISA 2015 Results (Volume I): Excellence and Equity in Education . Paris: OECD Publishing.

OECD (2018). The Future of Education and Skills. Education 2030 . Paris: OECD Publishing.

Ospennikova, E., Ershov, M., and Iljin, I. (2015). Educational robotics as an innovative educational technology. Proc. Soc. Behav. Sci. 214, 18–26. doi: 10.1016/j.sbspro.2015.11.588

Panadero, E., Klug, J., and Järvelä, S. (2016). Third wave of measurement in the self-regulated learning field: when measurement and intervention come hand in hand. Scand. J. Educ. Res. 60, 723–735. doi: 10.1080/00313831.2015.1066436

Papert, S. (1980). Mindstorms: Children, Computers and Powerful Ideas . New York, NY: Basic Books.

Pask, G. (1976). Conversation Theory, Applications in Education and Epistemology . Amsterdam: Elsevier.

Puttick, R., and Ludlow, J. (2012). Standards of Evidence for Impact Investing . London: Nesta.

Rohrer, D., and Pashler, H. (2010). Recent research on human learning challenges conventional instructional strategies. Educ. Res. 39, 406–412. doi: 10.3102/0013189X10374770

Rudman, P. (2002). Investigating domain information as dynamic support for the learner during spoken conversations (Unpublished Ph.D thesis). University of Birmingham, Birmingham.

Sandvik, K. B., and Lohne, K. (2014). The rise of the humanitarian drone: giving content to an emerging concept. Millennium 43, 145–164. doi: 10.1177/0305829814529470

Sattar, F., Tamatea, L., and Nawaz, M. (2017). Droning the pedagogy: future prospect of teaching and learning. Int. J. Educ. Pedagog. Sci . 11, 1622–1627.

Saunders, M. E., Roger, E., Geary, W. L., Meredith, F., Welbourne, D. J., Bako, A., et al. (2018). Citizen science in schools: engaging students in research on urban habitat for pollinators. Austral Ecol. 43, 635–642. doi: 10.1111/aec.12608

Scanlon, E., Woods, W., and Clow, D. (2014). Informal participation in science in the UK: identification, location and mobility with iSpot. J. Educ. Technol. Soc. 17, 58–71.

Scheffel, M., Drachsler, H., de Kraker, J., Kreijns, K., Slootmaker, A., and Specht, M. (2017). Widget, widget on the wall, am I performing well at all? IEEE Trans. Learn. Technol. 10, 42–52. doi: 10.1109/TLT.2016.2622268

Scherer, R., Greiff, S., and Kirschner, P. A. (2017). Editorial to the special issue: current innovations in computer-based assessments. Comput. Hum. Behav. 76, 604–606. doi: 10.1016/j.chb.2017.08.020

Schuttler, S. G., Sears, R. S., Orendain, I., Khot, R., Rubenstein, D., Rubenstein, N., et al. (2018). Citizen science in schools: students collect valuable mammal data for science, conservation, and community engagement. Bioscience 69, 69–79. doi: 10.1093/biosci/biy141

Semken, S., and Freeman, C. B. (2008). Sense of place in the practice and assessment of place-based science teaching. Sci. Educ. 92, 1042–1057. doi: 10.1002/sce.20279

Sharples, M. (2019). Practical Pedagogy: 40 Ways to Teach and Learn . London: Routledge.

Sharples, M., Adams, A., Alozie, N., Ferguson, F., FitzGerald, E., Gaved, M., et al. (2015). Innovating Pedagogy 2015 . Milton Keynes: Open University.

Sharples, M., Adams, A., Ferguson, R., Gaved, M., McAndrew, P., Rienties, B., et al. (2014). Innovating Pedagogy 2014 . Milton Keynes: Open University.

Sharples, M., de Roock, R., Ferguson, R., Gaved, M., Herodotou, C., Koh, E., et al. (2016). Innovating Pedagogy 2016: Open University Innovation Report 5 . Milton Keynes: The Open University.

Sharples, M., McAndrew, P., Weller, M., Ferguson, R., FitzGerald, E., Hirst, T., et al. (2012). Innovating Pedagogy 2012 . Milton Keynes: Open University.

Sharples, M., McAndrew, P., Weller, M., Ferguson, R., FitzGerald, E., Hirst, T., et al. (2013). Innovating Pedagogy 2013 . Milton Keynes: Open University.

Siemens, G. (2005). Connectivism: a learning theory for the digital age. Int. J. Instr. Technol. Distance Learn 2, 3–10. Available online at: https://web.archive.org/web/20190612101622/http://www.itdl.org/Journal/Jan_05/article01.htm

Silvertown, J., Harvey, M., Greenwood, R., Dodd, M., Rosewell, J., Rebelo, T., et al. (2015). Crowdsourcing the identification of organisms: a case-study of iSpot. ZooKeys 480, 125–146. doi: 10.3897/zookeys.480.8803

Sobel, D. (2004). Place-Based Education: Connecting Classrooms and Communities . Great Barrington, MA: Orion Society.

Staiff, R. (2016). Re-imagining Heritage Interpretation: Enchanting the Past-Future . London: Routledge. doi: 10.4324/9781315604558

Tempelaar, D. T., Rienties, B., and Giesbers, B. (2015). In search for the most informative data for feedback generation: learning analytics in a data-rich context. Comput. Hum. Behav. 47, 157–167. doi: 10.1016/j.chb.2014.05.038

Tempelaar, D. T., Rienties, B., Mittelmeier, J., and Nguyen, Q. (2018). Student profiling in a dispositional learning analytics application using formative assessment. Comput. Hum. Behav. 78, 408–420. doi: 10.1016/j.chb.2017.08.010

Theobald, P., and Curtiss, J. (2000). Communities as curricula. Forum Appl. Res. Public Policy 15, 106–111.

Thomas, G., and Munge, B. (2015). “Best practice in outdoor environmental education fieldwork: pedagogies to improve student learning,” in Experiencing the Outdoors , eds M. Robertson, G. Heath, and R. Lawrence (Brill Sense), 165–176. doi: 10.1007/978-94-6209-944-9_14

Thomas, G., and Munge, B. (2017). Innovative outdoor fieldwork pedagogies in the higher education sector: optimising the use of technology. J. Outdoor Environ. Educ. 20, 7–13. doi: 10.1007/BF03400998

Trevors, G., Feyzi-Behnagh, R., Azevedo, R., and Bouchet, F. (2016). Self-regulated learning processes vary as a function of epistemic beliefs and contexts: mixed method evidence from eye tracking and concurrent and retrospective reports. Learn. Instr. 42, 31–46. doi: 10.1016/j.learninstruc.2015.11.003

Trilling, B., and Fadel, C. (2009). 21st Century Skills: Learning for Life in Our Times. San Francisco: John Wiley & Sons.

von Glasersfeld, E. (1987). “Einführung in den radikalen Konstruktivismus,” in Wissen, Sprache und Wirklichkeit. Wissenschaftstheorie Wissenschaft und Philosophie , Vol. 24 (Wiesbaden: Vieweg+Teubner Verlag).

Vygotsky, L. S. (1978). Mind in Society. Cambridge, MA: Harvard University Press.

Webber, G., and Miller, D. (2016). Progressive pedagogies and teacher education: a review of the literature. McGill J. Educ. 51, 1061–1079. doi: 10.7202/1039628ar

Westlund, J. K., and Breazeal, C. (2015). “The interplay of robot language level with children's language learning during storytelling,” in Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts (Portland, OR: ACM). doi: 10.1145/2701973.2701989

Winne, P. H. (2017). Leveraging big data to help each learner upgrade learning and accelerate learning science. Teach. Coll. Rec. 119, 1–24.

Wood, D., Bruner, J. S., and Ross, G. (1979). The role of tutoring in problem solving. J. Child Psychol. Psychiatry 17, 89–100. doi: 10.1111/j.1469-7610.1976.tb00381.x

Wu, H.-K., Lee, S. W.-Y., Chang, H.-Y., and Liang, J.-C. (2013). Current status, opportunities and challenges of augmented reality in education. Comp. Educ. 62, 41–49. doi: 10.1016/j.compedu.2012.10.024

Zakaria, N. Y. K., Zaini, H., Hamdan, F., and Norman, H. (2018). Mobile game-based learning for online assessment in collaborative learning. Int. J. Eng. Technol. 7, 80–85. doi: 10.14419/ijet.v7i4.21.21620.

Zimmerman, B. J. (2000). “Attaining self-regulation: a social cognitive perspective,” in Handbook of Self-Regulation , eds M. Boekaerts, P. R. Pintrich, and M. Zeidner (San Diego, CA: Elsevier), 13–39. doi: 10.1016/B978-012109890-2/50031-7

Keywords: evidence-based practice, educational innovation, pedagogy, teaching and learning, educational effectiveness, educational theories, 21st century skills

Citation: Herodotou C, Sharples M, Gaved M, Kukulska-Hulme A, Rienties B, Scanlon E and Whitelock D (2019) Innovative Pedagogies of the Future: An Evidence-Based Selection. Front. Educ. 4:113. doi: 10.3389/feduc.2019.00113

Received: 01 June 2019; Accepted: 30 September 2019; Published: 11 October 2019.


Copyright © 2019 Herodotou, Sharples, Gaved, Kukulska-Hulme, Rienties, Scanlon and Whitelock. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Christothea Herodotou, christothea.herodotou@open.ac.uk

  • Research article
  • Open access
  • Published: 02 October 2020

Development of a new model on utilizing online learning platforms to improve students’ academic achievements and satisfaction

  • Hassan Abuhassna   ORCID: orcid.org/0000-0002-5774-3652 1 ,
  • Waleed Mugahed Al-Rahmi 1 ,
  • Noraffandy Yahya 1 ,
  • Megat Aman Zahiri Megat Zakaria 1 ,
  • Azlina Bt. Mohd Kosnin 1 &
  • Mohamad Darwish 2  

International Journal of Educational Technology in Higher Education volume  17 , Article number:  38 ( 2020 ) Cite this article


Abstract

This research aims to explore and investigate potential factors influencing students' academic achievements and satisfaction with using online learning platforms. The study was constructed based on Transactional Distance Theory (TDT) and Bloom's Taxonomy Theory (BTT) and was conducted with 243 students using online learning platforms in higher education, employing a quantitative research method. The research model illustrates eleven factors related to using online learning platforms to improve students' academic achievements and satisfaction. The findings showed that students' background, experience, collaboration, interaction, and autonomy positively affected their satisfaction. Moreover, students' application, remembering, understanding, analyzing, and satisfaction were positively associated with their academic achievements. Consequently, the empirical findings provide strong support for the integrative association between the TDT and BTT theories in relation to using online learning platforms to improve students' academic achievements and satisfaction, which could help decision makers in universities, colleges, and other higher education institutions to plan, evaluate, and implement online learning platforms.

Introduction

Over the previous two decades, higher education organizations have offered full courses online as an integral part of their curricula and have encouraged completion of those online courses. Additionally, the number of students who are not participating in any online courses has continued to drop over the past few years. It is therefore fair to state that online learning has become an established educational platform (Allen, Seaman, Poulin, & Straut, 2016 ). Online courses increasingly try to connect social networking components with expert content, because online resources are growing on a daily basis. Such courses depend on the active participation of a significant number of learners who participate independently in accordance with their learning objectives, skills, and previous background and experience (McAuley, Stewart, Siemens, & Cormier, 2010 ). Nevertheless, learners differ in their previous background and experience, along with their learning techniques, which clearly influences their online course results and achievement (Kauffman, 2015 ). Consequently, despite the evolution of online learning, it may not be appropriate for every learner (Bouhnik & Carmi, 2013 ). Moreover, while the application of online learning in the academic world has grown rapidly, not enough is known about learners' previous background and experience in online learning. Until recently, investigations concentrated on particular characteristics of learners' experiences and beliefs, for instance collaboration with their own instructor, online course quality, or studying with a certain learning management system (LMS) (Alexander & Golja, 2007 ; Lester & King, 2009 ). Generally, a limited number of courses or a single institution was investigated (Coates, James, & Baldwin, 2005 ; Lee, Yoon, & Lee, 2009 ). Few studies examined bigger sample sizes across one or more particular institutions (Alexander & Golja, 2007 ). Additionally, there is a shortage of research that examines learners' previous background and experience comparing face-to-face with online learning elements, e.g., (Bliuc, Goodyear, & Ellis, 2007 ). The development of learners' previous background, experience, and skills is recognized as a major advantage at the administrative level for online learning.

Similarly, learners' satisfaction and academic achievement in online learning have attracted considerable attention from scholars, who have employed several theoretical models to evaluate them (Abuhassna, Megat, Yahaya, Azlina, & Al-rahmi, 2020 ; Abuhassna & Yahaya, 2018 ; Al-Rahmi, Othman, & Yusuf, 2015a ; Al-Rahmi, Othman, & Yusuf, 2015b ). The present study highlights the effects of online learning platforms on students' satisfaction in relation to their background and prior experience with such platforms, in order to identify which learners are likely to be satisfied with an online course. Furthermore, this research explores the effects of the transactional distance theory (TDT) constructs of student collaboration, student-instructor dialogue or communication, and student autonomy on satisfaction. Accordingly, this study investigates students' academic achievements within online platforms, utilizing Bloom's theory to measure achievement through four main components, namely understanding, remembering, applying, and analyzing. The study could have a significant influence on online course design and development. Additionally, it may influence not only academic online courses but also other educational organizations, given that several organizations offer training courses and solutions online. Both researchers and instructors will be able to utilize and elaborate on the preliminary model developed through this research concerning the effects of online platforms on students' satisfaction and academic achievements. The advantages of online learning, along with its applications, have been noted in earlier related literature (Abuhassna et al., 2020 ; Abuhassna & Yahaya, 2018 ; Al-Rahmi et al., 2018 ). However, despite the growing usage of online platforms, there are shortcomings in how this technology is employed, which creates an issue in itself (Abuhassna & Yahaya, 2018 ; Al-Rahmi et al., 2018 ). Consequently, the research problem lies in the need to create a model that locates significant evidence, based on data about students' background, experiences, and interactions within online learning environments, which influence their academic performance and satisfaction. Such a model can serve as guidance for instructors and decision makers in the online education industry when using online platforms to improve students' learning experience. Bearing in mind these conditions, our major problem was: how can we enhance students' online learning experience in relation to both their academic achievements and satisfaction?

Research questions

The major research question that is anticipated to be answered is:

How can we enhance students' online learning experience in relation to both their academic achievements and satisfaction?

To answer this question, it is necessary to examine several sub-questions, stated as follows:

Q1: What is the relationship between students’ background and students’ satisfaction?

Q2: What is the relationship between students’ experience and students’ satisfaction?

Q3: What is the relationship between students’ collaboration and students’ satisfaction?

Q4: What is the relationship between students’ interaction and students’ satisfaction?

Q5: What is the relationship between students’ autonomy and students’ satisfaction?

Q6: What is the relationship between students’ satisfaction and students’ academic achievements?

Q7: What is the relationship between students’ application and students’ academic achievements?

Q8: What is the relationship between students’ remembering and students’ academic achievements?

Q9: What is the relationship between students’ understanding and students’ academic achievements?

Q10: What is the relationship between students’ analyzing and students’ academic achievements?

Research theory and hypotheses development

When designing web-based courses, or online learning instruction more generally, educators face several decisions and considerations that affect how students experience instruction, how they construct and process knowledge, how satisfied they are with the experience, and how web-based courses could enhance their academic achievements. In this study, we construct our theoretical framework according to Moore's transactional distance theory (TDT) to measure students' satisfaction, together with components of Bloom's theory to measure students' academic achievements. Although the origins of TDT can be traced to the work of Dewey, it is Michael Moore who is identified as the originator of the theory, which first appeared in 1972. In his study and development of the theory, he identified three main components of TDT that serve as the base for much of the research on distance learning (DL). Bloom's Taxonomy, in turn, was established in 1956 under the direction of an educational psychologist to measure students' academic achievement (Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956 ). TDT was selected for this study because the term transactional distance indicates the separation between the student and the instructor, based on the understanding that learning happens through the learner's interaction with their environment. The theory considers the role of three elements (student autonomy, dialogue, and course structure), which together help to investigate students' satisfaction. Moore's ( 1990 ) notion of transactional distance concerns the distance that occurs in all educational relationships; in the theory, this distance is mainly determined by the amount of dialogue between the student and the teacher and the amount of structure in the course design. This serves the main goal of this study, namely to enhance students' online learning experience in relation to their satisfaction. Bloom's theory was selected, in addition to TDT, to address students' online learning experience in relation to their achievements. In conclusion, both theories were used to develop this study's hypotheses. See Fig.  1 .

Figure 1. Research Model and Hypotheses.

Hypothesis of the study

H1: There is a significant relationship between students’ background and students’ satisfaction.

H2: There is a significant relationship between students’ experience and students’ satisfaction.

H3: There is a significant relationship between students’ collaboration and students’ satisfaction.

H4: There is a significant relationship between students’ interaction and students’ satisfaction.

H5: There is a significant relationship between students’ autonomy and students’ satisfaction.

H6: There is a significant relationship between students’ satisfaction and students’ academic achievements.

H7: There is a significant relationship between students’ application and students’ academic achievements.

H8: There is a significant relationship between students’ remembering and students’ academic achievements.

H9: There is a significant relationship between students’ understanding and students’ academic achievements.

H10: There is a significant relationship between students’ analyzing and students’ academic achievements.

Hypothesis developments and literature review

This section discusses the study hypotheses and relates each hypothesis to the relevant studies from the literature.

Students background toward online platforms

Students' background regarding online platforms in this study refers to their readiness and willingness to use and adapt to different online platforms, provided they are given the needed support and assistance. Students' background towards online learning is a crucial component of this process, as prior research has revealed implementation issues, for instance a deficiency of qualified lecturers, infrastructure, and facilities, in addition to limited student readiness and resistance to accepting online learning platforms, including Learning Management System (LMS) platforms, as educational tools (Azhari & Ming, 2015 ). However, student demand has continued to increase, spreading to global audiences due to the exceptional functionality, flexibility, and accessibility of online learning (Azhari & Ming, 2015 ). There have been persistent apprehensions regarding online learning quality compared with traditional learning settings. Paechter and Maier ( 2010 ) and Panyajamorn, Suthathip, Kohda, Chongphaisal, and Supnithi ( 2018 ) found that Austrian learners continue to prefer traditional learning environments for communication purposes and the preservation of interpersonal relations. Moreover, Lau and Shaikh ( 2012 ) found that Malaysian learners' internet efficiency and computer skills, along with personal demographics such as gender, background, level of study, and financial income, lead to significant differences in their readiness for online learning platforms. Abuhassna and Yahaya ( 2018 ) claimed that current technologies in education play an essential role in providing a full online learning experience that is close to a face-to-face class despite the physical separation of students from their educator and from other students. Online learning platforms lend themselves to a less hierarchical approach to education, fulfilling the learning needs of individuals who do not approach new information in a linear or systematic manner. Online learning platforms are additionally well suited to autonomous students (Abuhassna et al., 2020 ; Abuhassna & Yahaya, 2018 ; Paechter & Maier, 2010 ; Panyajamorn et al., 2018 ).

Students experience toward online platforms

Students' experience in the current research refers to learners' prior experience of utilizing online learning platforms in their educational settings. Students' experience with online learning offers several advantages to learners and their instructors in strengthening learning experiences, especially for isolated learners (Jaques & Salmon, 2007 ; Lau & Shaikh, 2012 ; Salmon, 2011 ; Salmon, 2014 ). Even when students recognize the advantages of technology in supporting their learning, difficulties may occur because of the limits of their technical capabilities and their prior experience with the functionality of the software itself. As demonstrated by learners' experience and feedback from several online sessions over the years, this can frequently become a source of frustration for both learners and their instructors, as it may make typically uncomplicated tasks, for instance watching a video or uploading a document, progressively more complicated for those with no such prior experience. Furthermore, when completing assessed activities such as online group presentations, the relatively limited ability to communicate face-to-face and to rely on non-verbal signals and the audience's body language can be a discouraging factor. Nonetheless, being able to participate with colleagues in online sessions, which are occasionally non-visual (for instance, in a teleconference format), is an increasingly significant skill in the modern workplace, affirming the importance of concise, clear, and intensive interaction skills (Salmon, 2011 ; Salmon, 2014 ).

Student collaboration among themselves in online platforms

Students' collaboration in the current study refers to the communication and feedback among students in online platforms. To refine and measure transactional distance using a survey tool, Rabinovich ( 2009 ) created a survey instrument for a higher education setting. The survey, concerning transactional distance and collaboration, was sent to 235 students enrolled in a synchronous web-based graduate class in business (Rabinovich, 2009 ). The synchronous learning environment was described as a place where “live on-campus classes are conveyed simultaneously to both in-class students on campus and remote students on the Web who join via virtual classroom Web collaboration software” (Rabinovich, 2009 ). This virtual classroom software is similar to the two kinds of software described by Falloon ( 2011 ) and Mathieson ( 2012 ), in that it allows students to interact with the educator and fellow students in real time (Rabinovich, 2009 ). Moreover, Kassandrinou, Angelaki, and Mavroidis ( 2014 ) reported that instructors play a crucial role as facilitators of interaction and communication, as they are tasked with fostering, reassuring, and assisting communication and interaction among students. Face-to-face tutorials have proven to be a valuable opportunity for many students to exchange ideas and discuss the content of the course and related concerns (Vasala & Andreadou, 2010 ).

Students’ interactions with the instructor in online platforms

Purposeful interaction, or dialogue, in the current study describes learner-learner and learner-instructor communication that is designed to improve the student's understanding. According to Shearer ( 2010 ), communication should also be constructive, in that it builds upon ideas and work from others and assists others in learning. Moore ( 1972 ) affirmed that learners must also recognize and value the importance of learning interactions as a vital part of the learning process. In a manner similar to Benson and Samarawickrema's ( 2009 ) study of teacher preparatory students, Falloon ( 2011 ) investigated the use of digital tools in a case study of a teacher education program in New Zealand. Mathieson ( 2012 ) also explored the role dialogue plays in digital learning environments; she created a digital survey that examined students' perceptions of audio-visual feedback in courses that utilize screencasting tools. Moore ( 2007 ) discusses autonomous learners searching for courses that do not stress structure and dialogue in order to explain and enhance their learning progression. Several studies (Abuhassna et al., 2020 ; Abuhassna & Yahaya, 2018 ; Al-Rahmi et al., 2015b ; Al-Rahmi, Othman, & Yusuf, 2015d ; Furnborough, 2012 ) concluded that the feeling of cooperation that learners share with their fellow students affects their reactions concerning collaboration with their peers.

Student autonomy in online platforms

Student autonomy in the current study refers to students' independence and motivation towards learning. The learner, with their expectations and requirements, drives the learning process, and each learner should be considered a unique individual who investigates their own capacities and possibilities. Thus, extraordinary importance is attributed to autonomy in DL environments, since the form of instructional intervention offered in distance education empowers students towards learning autonomy (Massimo, 2014 ). In this respect, the connection between student autonomy and specific parts of the learning process is at the center of consideration. Madjar, Nave, and Hen ( 2013 ) concluded that an autonomy-supportive environment leads learners to adopt more goal-guided learning, leading to greater learning achievements. This is why autonomy is desired in online settings, for both individual development and greater achievement in academic environments. The researchers also indicate that, while autonomy supports goal-guided outcomes, educator practices can lead to goals that are not adaptive; an autonomy-supportive learning process therefore needs to be designed with affective elements in mind as well. Stroet, Opdenakker, and Minnaert ( 2013 ) systematically surveyed 71 experimental studies on the impact of autonomy-supportive teaching on learner motivation and discovered a clear positive correlation. As in attribution theory, the relationship between learner control and motivation involves the possibility of learners adjusting their own motivations; for example, learners may be able to change self-determined extrinsic motivation into intrinsic motivation. However, Jacobs, Renandya, and Power ( 2016 ) further indicated that learners will not reach the same level of autonomy without reviewing their insights into autonomy, reflecting on their learning experiences, sharing these experiences and reflections with other learners, and recognizing the elements influencing all these processes, as well as the process of learning itself.

Student satisfaction in online platforms

Student satisfaction in the current study reflects the fact that many factors play a role in determining the learner's satisfaction, such as the faculty, the institution, individual learner elements, interaction and communication elements, course elements, and the learning environment. Discussion of these elements also relates to the role of the instructor, the learner's attitude, social presence, and the usefulness and effectiveness of online platforms. Yu ( 2015 ) found that student satisfaction was positively associated with interaction, self-efficacy, and self-regulation, without significant gender variations. Choy and Quek ( 2016 ) examined the relationships between learners' perceived teaching, social, and cognitive elements; in addition, satisfaction, academic performance, and achievement can be measured using a revised form of their survey instrument. Kirmizi ( 2014 ) studied the connections between six psychosocial scales: personal relevance, educator assistance, student interaction and collaboration, student autonomy, authentic learning, and active learning. A moderate level of correlation was found between these variables; the predictors of learner satisfaction were educator support, personal relevance, and authentic learning, while authentic learning was the only predictor of academic success. Bordelon ( 2013 ) identified and described a positive correlation between achievement and satisfaction, and suggested that the reasons behind such findings could be cultural variations in learner satisfaction that shape access to learning (Zhu, 2012 ). Scholars in the field of student satisfaction emphasize the delivery as well as the operational side of the student's experience in the teaching process (Al-Rahmi, Othman, & Yusuf, 2015e ).

Students’ academic achievements in online platforms

Students' achievement in this study refers to Bloom's four main components of achievement: remembering, understanding, applying, and analyzing. Findings from a study conducted by Whitmer ( 2013 ) revealed relationships between student academic achievement and LMS usage, showing a highly systematic association ( p  < .0000) for every variable. These variables explained between 12% and 23% of the variation in final course marks, which indicates that learners who used the LMS more often obtained higher marks than the others; correlation techniques examined these variables separately to ascertain their association with the final mark. Moreover, it is not the technology itself but the educational methods with which the technology is used that create a change in learners' achievement. The instruments used are significant in identifying the impact of technology, and it is the implementation of those instruments in specific activities and for certain purposes that indicates whether or not they are effective. In contrast, a study conducted by Barkand ( 2017 ) revealed that LMS tools were not considered to have an effect on semester final grades when categorized by school year. In his study, semester final grades were the measure of student achievement, which has subjective elements; to account for these, the study also included objective post-test scores to evaluate student learning. Additionally, in this study, we refer to Bloom's Taxonomy, established in 1956 under the direction of an educational psychologist, for measuring students' academic achievement (Bloom et al., 1956 ). We selected four domains of Bloom's Taxonomy in order to achieve the study objectives: application, which refers to using a concept in a new context, for instance applying what has been learned inside the classroom to different circumstances; remembering, which refers to recalling or retrieving previously learned knowledge; understanding, which refers to realizing the meaning and clarifying problem instructions; and analyzing, which refers to separating concepts or material into parts in such a way that its structure can be distinguished and understood, distinguishing among inferences and facts.

Students’ application

Applying involves “carrying out or using a procedure through executing or implementing” (Anderson & Krathwohl, 2001 ). Applying in this study refers to the student's ability to use online platforms: for example, how to log in, end a session, download materials, and access links and videos. Students can exchange information about a specific topic in online platforms such as Moodle, Google Documents, and wikis, and apply knowledge to create and participate in online platforms.

Students’ remembering

Remembering is defined as “retrieving, recognizing, and recalling relevant knowledge from long-term memory” (Anderson & Krathwohl, 2001 ). In this study, remembering refers to the ability to organize and remember online resources in order to easily find information on the internet. Moreover, students can easily cooperate with their colleagues and educator, contributing to the educational process and justifying their study procedure. In their review of Bloom's taxonomy, Anderson and Krathwohl ( 2001 ) identified the higher learning levels as creating, evaluating, and analyzing, and the lower learning levels as applying, understanding, and remembering.

Students’ understanding

Understanding involves “constructing meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining” (Anderson & Krathwohl, 2001 ). In this study, understanding refers to grasping a subject and then putting forward new suggestions about online settings, for instance understanding how e-learning or an LMS works. For example, students use online platforms to review the concepts, courses, and prominent resources being used inside the classroom environment.

Students’ analyzing

Analyzing includes “breaking material into constituent parts, determining how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing” (Anderson & Krathwohl, 2001 ). In this study, analyzing refers to the student's ability to connect, discuss, mark up, and then evaluate the information received within a given workspace. Solomon and Schrum ( 2010 ) claim that educators have started employing online platforms for a range of activities, since these platforms have become more familiar and there are ways for learners to benefit from using them. Generally, the purpose is to make visible the kinds of development, innovation, and additional activities that learners usually do independently. Such instruments have also provided instructors with ways to encourage and promote genuine cooperation in the development of their projects (Solomon & Schrum, 2010 ).

Research methodology

A quantitative approach was implemented in this study to provide inclusive insight into students' online learning experience and how to enhance both their satisfaction and academic achievements, using a questionnaire. Two experts were consulted to evaluate the questionnaire's content. Before data collection, permission for the current research was obtained from Universiti Teknologi Malaysia (UTM). In relation to sampling and population, the research was conducted among undergraduate learners who were online learning users. Learners who had received the questionnaires manually were requested to fill in their details and then record their own assessments regarding online learning platforms and their effects on academic achievements. For data analysis, the data obtained from the questionnaires were analyzed using the Statistical Package for the Social Sciences (SPSS); specifically, Structural Equation Modeling (SEM-AMOS) was employed as the primary data analysis tool. The SEM-AMOS process involves two main phases: evaluating the construct validity, convergent validity, and discriminant validity of the measurements, and then analyzing the structural model. These two phases followed the recommendations of Bagozzi, Yi, and Nassen (1998) and Hair, Sarstedt, Ringle, and Mena (2012a, 2012b).
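As an illustration of this two-phase procedure, the sketch below specifies and fits a comparable measurement and structural model in Python with the open-source semopy package, used here in place of the SPSS/AMOS tools the study actually employed. The construct abbreviations (BG, EXP, COL, INT, AUT, SAT, APP, REM, UND, ANA, ACH), the item names (bg1, bg2, ...), and the file survey_items.csv are illustrative assumptions, not the authors' actual variables or data.

```python
# A minimal sketch, assuming the semopy package and a CSV of Likert-scale item
# scores (one row per respondent, one column per questionnaire item).
# Construct and item names are placeholders, not the study's real labels.
import pandas as pd
from semopy import Model

# lavaan-style description: the "=~" lines form the measurement model,
# the "~" lines form the structural model encoding hypotheses H1-H10.
MODEL_DESC = """
BG  =~ bg1 + bg2 + bg3
EXP =~ exp1 + exp2 + exp3
COL =~ col1 + col2 + col3
INT =~ int1 + int2 + int3
AUT =~ aut1 + aut2 + aut3
SAT =~ sat1 + sat2 + sat3
APP =~ app1 + app2 + app3
REM =~ rem1 + rem2 + rem3
UND =~ und1 + und2 + und3
ANA =~ ana1 + ana2 + ana3
ACH =~ ach1 + ach2 + ach3
SAT ~ BG + EXP + COL + INT + AUT
ACH ~ SAT + APP + REM + UND + ANA
"""

data = pd.read_csv("survey_items.csv")   # hypothetical item-level data
model = Model(MODEL_DESC)
model.fit(data)                          # maximum-likelihood estimation
```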

Sample characteristics and data collection

A total of 283 questionnaires were distributed manually; of these, 264 (93.3%) were returned to the authors and screened using SPSS. Of the returned questionnaires, 21 were excluded: 14 were incomplete and 7 contained outliers. Thus, the overall number of valid questionnaires after this exclusion was 243. This exclusion step is supported by Hair et al. ( 2012a , 2012b ) and by Venkatesh, Thong, and Xu (2012), who pointed out that this procedure is essential because the presence of outliers could lead to inaccurate results. Regarding the respondents' demographic details, 91 (37.4%) were male and 152 (62.6%) were female; 149 (61.3%) were aged 18 to 20 years, 77 (31.7%) were aged 21 to 24 years, and 17 (7.0%) were aged 25 to 29 years. Regarding level of study, 63 (25.9%) were from level 1, 72 (29.6%) from level 2, 50 (20.6%) from level 3, and 58 (23.9%) from level 4.

Measurement instruments

The questionnaire in this study was developed to fit the study hypotheses and was therefore based on both theories utilized in this study. The questionnaire has two main sections: the first aims to measure student satisfaction and is based on the TDT variables; the second was developed to measure students' academic achievement based on Bloom's theory, according to which four variables measure students' achievement, namely application, remembering, understanding, and analyzing. On that basis, the questionnaire was developed to measure both students' satisfaction and academic achievements, and the construct items were adapted to ensure content validity. The first part covered the demographic details of the respondents, including age, gender, and educational level. The second part comprised 51 items adapted from previous research, as follows: student background, five items, and student experience, five items, adapted from Akaslan and Law ( 2011 ); student collaboration and student interaction items adapted from Bolliger and Inan ( 2012 ); student autonomy, five items, adapted from Barnard et al. ( 2009 ) and Pintrich, Smith, Garcia, and McKeachie ( 1991 ); student satisfaction, six items, adapted from the blended learning impact evaluation conducted at UCF by the Research Initiative for Teaching Effectiveness (n.d.); and students' application (four items), remembering (four items), understanding (four items), analyzing (four items), and academic achievements (four items), adapted from Pekrun, Goetz, and Perry ( 2005 ). The questionnaire was distributed to the students after they had taken the online course.
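For reference, the item counts reported above can be tallied to show how the 51 scale items (excluding the demographic part) are distributed across constructs. The individual split between collaboration and interaction items is not stated in the text, so the sketch below only derives their combined remainder.

```python
# Item counts as reported in the text; collaboration and interaction have no
# individual counts given, so only their combined remainder is derived.
reported_items = {
    "background": 5, "experience": 5, "autonomy": 5, "satisfaction": 6,
    "application": 4, "remembering": 4, "understanding": 4, "analyzing": 4,
    "academic achievement": 4,
}
total_items = 51
explicit = sum(reported_items.values())      # 41 items with explicit counts
remainder = total_items - explicit           # 10 items shared by collaboration + interaction
print(explicit, remainder)
```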

Result and analysis

The Cronbach's Alpha reliability coefficient was 0.917 across all research model factors. The discriminant validity (DV) assessment was carried out using three criteria: the correlation index between variables is expected to be less than 0.80 (Bagozzi, Yi, & Nassen, 1988 ); each construct's AVE value must be equal to or higher than 0.50; and the square root of the AVE for every construct should be higher than the inter-construct correlations (IC) associated with that factor [49]. Furthermore, in the confirmatory factor analysis (CFA) findings, the factor loadings (FL) should be 0.70 or above, and the Cronbach's Alpha (CA) results should be ≥0.70 [50]. Researchers have also added that composite reliability (CR) should be ≥0.70.
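To make these reliability and validity criteria concrete, the following sketch computes Cronbach's alpha from raw item scores, and composite reliability (CR) and average variance extracted (AVE) from standardized factor loadings. All numbers are illustrative placeholders, not the study's data; the thresholds noted in the comments are the conventional cut-offs cited above.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def composite_reliability(loadings: np.ndarray) -> float:
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    errors = 1 - loadings ** 2               # error variance of standardized items
    return loadings.sum() ** 2 / (loadings.sum() ** 2 + errors.sum())

def average_variance_extracted(loadings: np.ndarray) -> float:
    """AVE = mean of squared standardized loadings."""
    return float(np.mean(loadings ** 2))

# illustrative standardized loadings for one construct (e.g., satisfaction)
sat_loadings = np.array([0.78, 0.81, 0.74, 0.80, 0.76, 0.79])
print(composite_reliability(sat_loadings))       # conventional cut-off: >= 0.70
print(average_variance_extracted(sat_loadings))  # conventional cut-off: >= 0.50

# illustrative raw scores for the same six Likert items (5 respondents)
scores = np.array([
    [4, 5, 4, 4, 5, 4],
    [3, 3, 4, 3, 3, 3],
    [5, 5, 5, 4, 5, 5],
    [2, 3, 2, 3, 2, 2],
    [4, 4, 4, 5, 4, 4],
])
print(cronbach_alpha(scores))                    # conventional cut-off: >= 0.70
```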

Model analysis

The current research employed AMOS 23 to analyze the data. Both structural equation modeling (SEM) and confirmatory factor analysis (CFA) were employed as the main analysis tools. Uni-dimensionality, reliability, convergent validity, and discriminant validity were used to assess the measurement model. Bagozzi et al. ( 1988 ), Byrne ( 2010 ), and Kline ( 2011 ) highlighted goodness-of-fit guidelines such as the normed chi-square (chi-square/degrees of freedom), normed fit index (NFI), relative fit index (RFI), Tucker-Lewis index (TLI), comparative fit index (CFI), incremental fit index (IFI), parsimonious goodness-of-fit index (PGFI), root mean square error of approximation (RMSEA), and root mean-square residual (RMR); all of these can be utilized as assessment procedures for model estimation. See Table  1 and Fig.  2 .

Figure 2. Measurement Model.
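Several of the fit indices named above can be computed directly from the chi-square statistics of the fitted (target) model and of the baseline (independence) model. The sketch below shows the standard formulas for the normed chi-square, CFI, TLI, and RMSEA; the chi-square and degrees-of-freedom values are placeholders rather than the study's results, and only the sample size (n = 243) follows the paper.

```python
import math

def fit_indices(chi2_t: float, df_t: int, chi2_b: float, df_b: int, n: int) -> dict:
    """Normed chi-square, CFI, TLI, and RMSEA from the target (t) and baseline (b)
    model chi-squares; conventional cut-offs are chi2/df < 3, CFI and TLI >= 0.90,
    and RMSEA <= 0.08."""
    d_t = max(chi2_t - df_t, 0.0)
    d_b = max(chi2_b - df_b, 0.0)
    cfi = 1.0 - d_t / max(d_b, d_t, 1e-12)
    tli = ((chi2_b / df_b) - (chi2_t / df_t)) / ((chi2_b / df_b) - 1.0)
    rmsea = math.sqrt(d_t / (df_t * (n - 1)))
    return {"chi2/df": chi2_t / df_t, "CFI": cfi, "TLI": tli, "RMSEA": rmsea}

# placeholder chi-square values; n = 243 matches the study's valid sample size
print(fit_indices(chi2_t=1450.0, df_t=1200, chi2_b=9800.0, df_b=1275, n=243))
```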

Measurement model

This type of validity is commonly employed to specify the degree of difference between a concept with its indicators and other concepts (Hair et al., 2012a , 2012b ). In this context, discriminant validity proved to be satisfactory for all concepts, given that values were above the 0.50 cut-off at p  = 0.001 according to Fornell and Larcker ( 1981 ). In line with Hair et al. ( 2012a , 2012b ) and Bagozzi, Yi, and Nassen (1998), the correlation between items of any two specified constructs must not exceed the square root of the average variance shared between the items of a single construct. The values of composite reliability (CR) and Cronbach's Alpha (CA) remained at about 0.70 and over, while the average variance extracted (AVE) remained at about 0.50 and higher, indicating that all factor loadings (FL) were significant and thereby fulfilling the conventions of the current assessment (Bagozzi, Yi, & Nassen, 1998; Byrne, 2010 ). The following sections expand on the results of the measurement model. The findings for validity, reliability, average variance extracted (AVE), composite reliability (CR), and Cronbach's Alpha (CA) were all accepted, which also demonstrated discriminant validity. All values of CR varied between 0.812 and 0.917, above the cut-off value of 0.70. The CA values varied between 0.839 and 0.897, also exceeding the cut-off value of 0.70. The AVE was similarly higher than 0.50, varying between 0.610 and 0.684. All these findings are positive, indicating significant FLs, and they comply with the conventional assessment guidelines of Bagozzi, Yi, and Nassen (1998) and Fornell and Larcker ( 1981 ). See Table  2 and Additional file  1 .
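A compact way to apply the Fornell and Larcker (1981) criterion described above is to place the square root of each construct's AVE on the diagonal of the inter-construct correlation matrix and verify that every diagonal entry exceeds the correlations in its row and column. The sketch below does this for three illustrative constructs; the AVE and correlation values are placeholders, not the study's results.

```python
import numpy as np
import pandas as pd

def fornell_larcker(ave: pd.Series, corr: pd.DataFrame) -> pd.DataFrame:
    """Matrix with sqrt(AVE) on the diagonal and inter-construct correlations
    off-diagonal; discriminant validity holds when each diagonal entry exceeds
    the off-diagonal entries in its row and column."""
    values = corr.to_numpy(dtype=float, copy=True)
    np.fill_diagonal(values, np.sqrt(ave.loc[corr.index].to_numpy()))
    return pd.DataFrame(values, index=corr.index, columns=corr.columns)

# illustrative AVE values and correlations for three of the eleven constructs
ave = pd.Series({"SAT": 0.66, "AUT": 0.63, "ACH": 0.68})
corr = pd.DataFrame(
    [[1.00, 0.52, 0.41],
     [0.52, 1.00, 0.38],
     [0.41, 0.38, 1.00]],
    index=["SAT", "AUT", "ACH"], columns=["SAT", "AUT", "ACH"],
)
print(fornell_larcker(ave, corr).round(3))
```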

Structural model analysis

In the current study, path modeling was used to examine the impact of the following factors on students’ academic achievements in higher education institutions in the context of online learning: students’ background, students’ experience, students’ collaborations, students’ interaction, students’ autonomy, students’ remembering, students’ understanding, students’ analyzing, students’ application, and students’ satisfaction. The findings are presented and then compared in the hypothesis-testing discussion. Subsequently, as the second stage, structural equation modeling (SEM) was conducted, building on the confirmatory factor analysis (CFA), to assess the proposed hypotheses, as demonstrated in Fig. 3.
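
Written out, the two structural blocks implied by this description take the following form (a sketch of the hypothesized paths H1-H10; in the study these paths are estimated simultaneously in AMOS rather than as two separate equations):

\text{Satisfaction} = \beta_{1}\,\text{Background} + \beta_{2}\,\text{Experience} + \beta_{3}\,\text{Collaboration} + \beta_{4}\,\text{Interaction} + \beta_{5}\,\text{Autonomy} + \varepsilon_{1}

\text{Achievement} = \beta_{6}\,\text{Satisfaction} + \beta_{7}\,\text{Application} + \beta_{8}\,\text{Remembering} + \beta_{9}\,\text{Understanding} + \beta_{10}\,\text{Analyzing} + \varepsilon_{2}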

Figure 3. Findings for the proposed model: path analysis.

As shown in Figs. 3 and 4, all hypotheses were accepted. Moreover, Table 3 below shows that the fundamental statistics of the model were good, indicating model validity, along with the results of the hypothesis tests, including the unstandardized coefficients and standard errors of the structural model.

Figure 4. Findings for the proposed model: t-values.

The first five hypotheses addressed the direct relationships of students’ background, students’ experience, students’ collaborations, students’ interaction, and students’ autonomy with students’ satisfaction. In accordance with Fig. 4 and Table 3, the relationship between students’ background and students’ satisfaction was (β = .281, t = 5.591, p < 0.001), indicating that the first hypothesis (H1), which proposed a positive and significant relationship, was supported. The second hypothesis concerned the relationship between students’ experience and students’ satisfaction (β = .111, t = 1.951, p < 0.001), indicating that the proposed positive and significant relationship (H2) was supported. The third hypothesis concerned the relationship between students’ collaborations and students’ satisfaction (β = .123, t = 2.584, p < 0.001), indicating that the proposed positive and significant relationship (H3) was supported. Additionally, the relationship between students’ interaction and students’ satisfaction was (β = .116, t = 2.212, p < 0.001), indicating that the fourth hypothesis (H4) was supported. Further, the relationship between students’ autonomy and students’ satisfaction was (β = .470, t = 7.711, p < 0.001), indicating that the fifth hypothesis (H5) was supported. The second set of five hypotheses concerned students’ satisfaction, students’ remembering, students’ understanding, students’ analyzing, and students’ application in relation to students’ academic achievements.

As shown in Fig. 4 and Table 3, the association between students’ satisfaction and students’ academic achievements was (β = .135, t = 3.473, p < 0.001), indicating that the sixth hypothesis (H6), which proposed a positive and significant relationship, was supported. The seventh hypothesis concerned the relationship between students’ application and students’ academic achievements (β = .215, t = 6.361, p < 0.001), indicating that the proposed relationship (H7) was supported. The eighth hypothesis concerned the relationship between students’ remembering and students’ academic achievements (β = .154, t = 4.228, p < 0.001), indicating that H8 was supported. Additionally, the correlation between students’ understanding and students’ academic achievements was (β = .252, t = 6.513, p < 0.001), supporting the ninth hypothesis (H9). Finally, the relationship between students’ analyzing and students’ academic achievements was (β = .179, t = 6.215, p < 0.001), supporting the tenth hypothesis (H10). Accordingly, the current model demonstrated students’ readiness to use online learning platforms to improve their academic achievements and satisfaction. This is in accordance with earlier investigations (Abuhassna & Yahaya, 2018; Al-Rahmi et al., 2018; Al-rahmi, Othman, & Yusuf, 2015c; Barkand, 2017; Madjar et al., 2013; Salmon, 2014).
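
For readers who want to prototype this kind of two-block path analysis outside AMOS, the sketch below is a simplified approximation that fits the two regression blocks separately on composite scores using simulated placeholder data. It is not the authors' SEM procedure; the variable names and the coefficients used to simulate the data are illustrative assumptions only.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 243  # sample size reported in the study

# Simulated composite scores for the nine exogenous constructs (placeholders).
predictors = ["background", "experience", "collaboration", "interaction",
              "autonomy", "application", "remembering", "understanding", "analyzing"]
df = pd.DataFrame(rng.normal(size=(n, len(predictors))), columns=predictors)

# Simulated outcomes; the weights below are arbitrary illustrative values.
df["satisfaction"] = (0.3 * df.background + 0.1 * df.experience + 0.1 * df.collaboration
                      + 0.1 * df.interaction + 0.45 * df.autonomy + rng.normal(size=n))
df["achievement"] = (0.15 * df.satisfaction + 0.2 * df.application + 0.15 * df.remembering
                     + 0.25 * df.understanding + 0.2 * df.analyzing + rng.normal(size=n))

# Block 1: predictors of satisfaction (H1-H5).
m1 = smf.ols("satisfaction ~ background + experience + collaboration + interaction + autonomy",
             data=df).fit()
# Block 2: predictors of academic achievement (H6-H10).
m2 = smf.ols("achievement ~ satisfaction + application + remembering + understanding + analyzing",
             data=df).fit()

print(m1.params.round(3))
print(m2.params.round(3))

A full SEM treatment would instead model the latent constructs with their indicators and estimate both blocks simultaneously, as the paper does in AMOS.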

Discussion and implications

Developing a new hybrid technology acceptance model by combining TDT and BTT was the major objective of the current research, which aimed to investigate the factors guiding the use of online learning platforms to improve students’ academic achievements and satisfaction in higher education institutions. The current research takes a step forward by implementing TDT together with a BTT model. Using the proposed model, the research examined how students’ background, students’ experience, students’ collaborations, students’ interactions, and students’ autonomy positively affected students’ satisfaction, and how students’ application, students’ remembering, students’ understanding, students’ analyzing, and students’ satisfaction positively affected students’ academic achievements. The findings supported both sets of relationships, and this conclusion is consistent with earlier related literature. It suggests that learners first make sure that online learning platforms can meet their study requirements, or that such platforms are relevant to their study process, before considering employing this technology in their studies. Learners have been noted to perceive online learning platforms as more useful only once they discover that such technology is actually better than traditional learning without online platforms (Choy & Quek, 2016; Illinois Online Network, 2003). The following section compares the results of this study with previous research.

The first hypothesis of this study demonstrated a positive and significant association between students’ prior background with online platforms and their satisfaction. As investigated in the study by Osika and Sharp (2002), numerous learners lacking these core skills enroll in courses, struggle, and subsequently drop out. In addition, the investigation by Bocchi, Eastman, and Swift (2004) claimed that prior knowledge of students’ concerns, demands, and anticipations is crucial for constructing efficient instruction. In other words, students should have prior knowledge and background before entering online platforms. On the other hand, there are ongoing concerns about the quality of online learning platforms in comparison to face-to-face learning environments when students do not have the essential skills required to use them (Illinois Online Network, 2003). Moreover, a study by Alalwan et al. (2019) found that Austrian learners still prefer face-to-face learning for communication purposes and the preservation of interpersonal relations, arguably because learners do not yet have the background knowledge and skills needed to use online learning platforms. Additional research by Orton-Johnson (2009) among UK learners claimed that learners have not accepted online materials and continue to prefer traditional materials as the medium for their learning, which also indicates the importance of prior knowledge of and background with online platforms before adopting such technology.

The second hypothesis of this study proposed a positive and significant association between students’ experience and students’ satisfaction, which suggests that providing students with such experience supports their ability to overcome the difficulties that arise from the limits of the technical capabilities of online platforms. This is in line with earlier research on the reasons behind people’s technology acceptance behavior. One reason is the notion of “compatibility,” that is, the degree to which an individual considers an innovation to be consistent with their existing needs, experiences, values, and practices (Chau & Hu, 2002; Moore & Benbasat, 1991; Rogers, 2003; Taylor & Todd, 1995). Moreover, Anderson and Reed (1998), Galvin (2003), and Lewis (2004) claimed that most students who had prior experience with online education tended to exhibit positive attitudes toward it, which in turn affects their attitudes toward online learning platforms.

The third hypothesis of this study demonstrated a positive and significant association between students’ collaboration with one another on online platforms and their satisfaction, which indicates the key role of collaboration among students in making the experience more realistic and increasing their sense of involvement and activity. This is in agreement with Al-rahmi, Othman, and Yusuf (2015f), who claimed that the type, quality, and amount of feedback that each student received was correlated with the student’s sense of success or course satisfaction. Moreover, Rabinovich (2009) found that all types of dialogue were important to transactional distance, which makes it easier for students to adapt to online learning platforms. Also, online learning platforms enable learners to share and exchange information with their colleagues (Abuhassna et al., 2020; Abuhassna & Yahaya, 2018).

Students’ interaction with the instructor in online platforms

The fourth hypothesis of this study proposed a positive and significant relationship between students’ interaction and students’ satisfaction, which indicates the significance of communication between students and their instructor throughout the online platform experience. These results agree with Mathieson (2012), who stated that the ability to communicate with the instructor lowered the sense of separation between learner and educator. Moreover, in line with Kassandrinou et al. (2014), communication leads learners to experience constructive emotions, for example relief, satisfaction, and excitement, which assist them in achieving their educational goals. In addition, Furnborough (2012) concluded that learners’ sense of cooperating with their fellow students affects how they respond to collaboration with their peers. Kassandrinou et al. (2014) also emphasized the instructor’s crucial role as a facilitator of interaction and communication, constantly fostering, reassuring, and assisting communication and interaction among students.

Student’s autonomy in online platforms

The fifth hypothesis of this study proposed a positive and significant relationship between students’ autonomy and students’ satisfaction with online learning platforms, which indicates that students need a sense of independence when using online platforms. This agrees with Madjar et al. (2013), who concluded that an autonomy-supportive environment leads learners to adopt more learning goals, resulting in greater learning achievement. Moreover, Stroet et al. (2013) found a clear positive effect of autonomy-supportive teaching on learner motivation. O’Donnell, Chang, and Miller (2013) also argue that autonomy is the ability of learners to govern themselves, especially in making decisions, setting their own course, and taking responsibility for their own actions.

Student’s satisfaction in online platforms

The sixth hypothesis of this study proposed a positive and significant relationship between students’ satisfaction and their academic achievements on online learning platforms, which indicates a level of acceptance by students in adapting to online learning platforms. This is in agreement with Zhu (2012), who reported that students’ satisfaction with online platforms is a statement of confidence in the system. Moreover, Kirmizi (2014) revealed that the predictors of learners’ satisfaction were educator support, personal relevance, and authentic learning, whereas authentic learning was the only predictor of academic success. Furthermore, Bordelon (2013) determined a positive correlation between satisfaction and achievement. In addition, Mahle (2011) clarified that student satisfaction occurs when learners realize that their accomplishment has met their expectations, and can therefore be considered a short-term attitude toward the learning process.

Hypotheses seven, eight, nine, and ten of this study proposed positive and significant relationships between these cognitive factors and students’ academic achievements on online learning platforms, which indicates the key role of online platforms in students’ academic achievements. This agrees with the findings of Whitmer (2013), which revealed a highly systematic relationship between students’ usage of the LMS and academic achievement. In contrast, Barkand (2017) found no significant difference in students’ academic achievements when utilizing online platforms, which may be because academic achievement on online learning platforms requires a certain set of skills and knowledge, as mentioned in the sections above, in order to make such technology a success.

The seventh hypothesis of this study proposed a positive and significant correlation between students’ application and students’ academic achievements, which indicates the importance of application in the learning process as an effective element. This is in line with the Computer Science Teachers’ Association (CSTA) taskforce in the U.S. (Computer Science Teachers’ Association (CSTA), 2011), which argued that applying elements of computing skills is essential in all state curricula, pointing to their value for improving pupils’ higher-order thinking as well as general problem-solving abilities. Moreover, Gouws, Bradshaw, and Wentworth (2013) created a theoretical framework that mapped educational computational thinking onto cognitive levels derived from Bloom’s taxonomy of learning objectives. Four thinking-skill levels were used to assess the “cognitive demands” posed by computational concepts such as abstraction, modelling, developing algorithms, and generating automated processes. In the context of the iPad app LightBot, these thinking skills were Recognising (recognise and recall knowledge related to the problem), Understanding (interpret, compare, and explain the problem), Applying (use computing skills to create a solution), and Assimilating (critically decompose and analyse the problem).

The eighth hypothesis of this study proposed a positive and significant correlation between students’ remembering and students’ academic achievements, which indicates the importance of remembering, as a process of retrieving information relating to what needs to be done and/or outcome attributes, throughout the learning procedure according to Bloom’s taxonomy of educational objectives. Additionally, Falloon (2016) reported that responding to data indicated the use of general thinking skills to clarify and understand the steps and stages needed to complete a task (average 29%); recalling or remembering information about a task or available tools (average 13%); and discussing and understanding success criteria (average 3%).

The ninth hypothesis of this study proposed a positive and significant correlation between students’ understanding and students’ academic achievements, which indicates its significance for academic achievement as a process of breaking the task or problem faced by students into phases or activities to help understand how to resolve it. The current results agree with Falloon (2016), who demonstrated the need to build understanding of the thinking processes employed by students when they are engaged in their work. In addition, Falloon (2016) suggested that the purpose and nature of questioning was broader than this, with questioning of self and others being an important strategy in solution development. In many respects, questioning for those students was not so much a perspective as a practice, to the degree that it assisted them in understanding their tasks, analyzing intended or developed explanations, and evaluating their outcomes.

The tenth hypothesis of this study proposed a positive and significant correlation between students’ analyzing and students’ academic achievements, which reveals the importance of analysis as a process of employing general thinking and computational knowledge to understand the challenges involved in using online platforms, in addition to predictive thinking to categorize, explore, and fix any possible errors throughout the process. Falloon (2016) claimed that analyzing was often a collaborative procedure, with pairs giving and receiving advice from others to help solve complications. Furthermore, online learning platforms are highly dependent on connecting and sharing as a basic strategy that needs to be employed at all stages of online learning settings, whether between students or between students and their instructor. Moreover, Falloon’s (2016) findings showed that analyzing (average 17%) was present in various phases of these students’ work, depending on the phase they were at and their tasks, although most analysis was associated with students relying on themselves during the online process.

Conclusion and future work

In this investigation, both transactional distance theory (TDT) and Bloom’s taxonomy theory (BTT) were validated in the educational context, providing further understanding of students’ prospective perceptions of using online learning platforms to improve academic achievement and satisfaction. The contribution that the current research makes to the field of online learning platforms has been discussed and explained, and additional insights into students’ satisfaction and academic achievements have been presented. The current research emphasizes that the incorporation of both TDT and BTT can positively influence the research outcome. It has also determined that numerous stakeholders, for instance developers, system designers, and institutional users of online learning platforms, should reasonably consider student demands and needs, and ensure that such a system effectively meets their requirements. Adoption among users of online learning platforms can be broadly explained by the eleven factors on which this research model is based. Thus, the current research suggests that further investigation be carried out to examine the complexity of online learning platforms in combination with the technology acceptance model (TAM).

Recommendations for stakeholders of online platforms

Based on the study findings, the first recommendation is for administrators of higher education institutions. To implement online learning, more attention must be given to course structure design, which should be based on theory and prior literature. Moreover, instructors and course developers need to be trained and skilled to achieve the goals of online learning platforms. Workshops and training sessions should be given to both instructors and students to familiarize them with learning management systems such as Moodle and to help them take full advantage of them. The software itself is not enough to create an online learning environment that is suitable for students and instructors. If instructors are not trained in, or are unaware of how to use, the software (e.g., Moodle) in class, then the quality of education imparted to students will be jeopardized. Training and assessing the class instructor and making modifications to the software could result in a good environment for the instructor and a quality education for the student. Both students’ satisfaction and academic achievements depend on their prior knowledge and experience of online learning. The current research investigated student satisfaction and academic achievements in relation to online learning platforms at one higher education institution in Malaysia. Future research could extend this work to blended learning settings.

Availability of data and materials

All the hardcopy questionnaires, data and statistical analysis are available.

Abuhassna, H., Megat, A., Yahaya, N., Azlina, M., & Al-rahmi, W. M. (2020). Examining Students' satisfaction and learning autonomy through web-based courses. International Journal of Advanced Trends in Computer Science and Engineering , 1 (9), 356–370. https://doi.org/10.30534/ijatcse/2020/53912020 .

Abuhassna, H., & Yahaya, N. (2018). Students’ utilization of distance learning through an interventional online module based on Moore transactional distance theory. Eurasia Journal of Mathematics, Science and Technology Education , 14 (7), 3043–3052. https://doi.org/10.29333/ejmste/91606 .

Akaslan, D., & Law, E. L.-C. (2011). Measuring student E-learning readiness: A case about the subject of Electricity in Higher Education Institutions in Turkey. In H. Leung, E. Popescu, Y. Cao, R. W. H. Lau, & W. Nejdl (Eds.), ICWL 2011. LNCS, vol. 7048 , (pp. 209–218). Heidelberg: Springer.

Alalwan, N., Al-Rahmi, W. M., Alfarraj, O., Alzahrani, A., Yahaya, N., & Al-Rahmi, A. M. (2019). Integrated three theories to develop a model of factors affecting students’ academic performance in higher education. IEEE Access , 7 , 98725–98742.

Alexander, S., & Golja, T. (2007). Using students' experiences to derive quality in an e-learning system: An institution's perspective. Educational Technology & Society , 10 (2), 17–33.

Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016). Online report card: Tracking online education in the United States. Babson survey research group and the online learning consortium (OLC), Pearson, and WCET state authorization Network .

Al-Rahmi, W., Othman, M. S., & Yusuf, L. M. (2015b). The role of social media for collaborative learning to improve academic performance of students and researchers in Malaysian higher education. The International Review of Research in Open and Distributed Learning , 16 (4). http://www.irrodl.org/index.php/irrodl/article/view/2326 . https://doi.org/10.19173/irrodl.v16i4.2326 .

Al-Rahmi, W. M., Alias, N., Othman, M. S., Alzahrani, A. I., Alfarraj, O., Saged, A. A., & Rahman, N. S. A. (2018). Use of e-learning by university students in Malaysian higher educational institutions: A case in Universiti Teknologi Malaysia. IEEE Access , 6 , 14268–14276.

Al-Rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015a). The effectiveness of using e-learning in Malaysian higher education: A case study Universiti Teknologi Malaysia. Mediterranean Journal of Social Sciences , 6 (5), 625–625.

Al-rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015c). Using social media for research: The role of interactivity, collaborative learning, and engagement on the performance of students in Malaysian post-secondary institutes. Mediterranean Journal of Social Sciences , 6 (5), 536.

Al-Rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015d). Exploring the factors that affect student satisfaction through using e-learning in Malaysian higher education institutions. Mediterranean Journal of Social Sciences , 6 (4), 299.

Al-Rahmi, W. M., Othman, M. S., & Yusuf, L. M. (2015e). Effect of engagement and collaborative learning on satisfaction through the use of social media on Malaysian higher education. Res. J. Appl. Sci., Eng. Technol , 9 (12), 1132–1142.

Anderson, D. K., & Reed, W. M. (1998). The effects of internet instruction, prior computer experience, and learning style on teachers’ internet attitudes and knowledge. Journal of Educational Computing Research , 19 (3), 227–246. https://doi.org/10.2190/8WX1-5Q3J-P3BW-JD61 .

Anderson, L. W., & Krathwohl, D. R. (Eds.) (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives . New York: Longman.

Azhari, F. A., & Ming, L. C. (2015). Review of e-learning practice at the tertiary education level in Malaysia. Indian Journal of Pharmaceutical Education and Research , 49 (4), 248–257.

Bagozzi, R. P., Yi, Y., & Nassen, K. D. (1988). Representation of measurement error in marketing variables: Review of approaches and extension to three-facet designs. Elsevier. Journal of Econometrics , 89 (1–2), 393–421.

Barkand, J. M. (2017). Using educational data mining techniques to analyze the effect of instructors' LMS tool use frequency on student learning and achievement in online secondary courses. Available from ProQuest Dissertations & Theses Global. Retrieved from https://vpn.utm.my/docview/2007550976?accountid=41678

Barnard, L., Lan, W. Y., To, Y. M, Paton, V. O., & Lai, S. L. (2009). Measuring self-regulation in online and blended learning environments. The Internet and Higher Education , 12 (1), 1–6. https://doi.org/10.1016/j.iheduc.2008.10.005 .

Benson, R., & Samarawickrema, G. (2009). Addressing the context of e-learning: Using transactional distance theory to inform design. Distance Education Journal , 30 (1), 5–21.

Bliuc, A. M., Goodyear, P., & Ellis, R. A. (2007). Research focus and methodological choices in studies into students' experiences of blended learning in higher education. The Internet and Higher Education , 10 , 231–244.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives, handbook I: The cognitive domain . New York: David McKay Co Inc.

Bocchi, J., Eastman, J. K., & Swift, C. O. (2004). Retaining the online learner: Profile of students in an online MBA program and implications for teaching them. Journal of Education for Business , 79 (4), 245–253.

Bolliger, D. U., & Inan, F. A. (2012). Development and validation of the online student connectedness survey (OSCS). The International Review of Research in Open and Distributed Learning , 13 (3), 41–65. https://doi.org/10.19173/irrodl.v13i3.1171 .

Bordelon, K. (2013). Perceptions of achievement and satisfaction as related to interactions in online courses (PhD dissertation) . Northcentral University.

Bouhnik, D., & Carmi, G. (2013). Thinking styles in virtual learning courses (pp. 141–145). Toronto: Proceedings of the 2013 international conference on information society (i-society). Retrieved from: http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6619545 .

Byrne, B. M. (2010). Structural equation modeling with AMOS: Basic concepts, applications, and programming , (2nd ed., ). New York: Routledge.

Chau, P. Y. K., & Hu, P. J. (2002). Examining a model of information technology acceptance by individual professionals: An exploratory study. Journal of Management Information System , 18 (4), 191–229.

Choy, J. L. F., & Quek, C. L. (2016). Modelling relationships between students’ academic achievement and community of inquiry in an online learning environment for a blended course. Australasian Journal of Educational Technology , 32 (4), 106–124 https://doi.org/10.14742/ajet.2500 .

Coates, H., James, R., & Baldwin, G. (2005). A critical examination of the effects of learning management systems on university teaching and learning. Tertiary Education and Management , 11 , 19–36.

Computer Science Teachers’ Association (CSTA). (2011) The computational thinking leadership toolkit. [Online] Available from: http://www.csta.acm.org/Curriculum/sub/CompThinking.html [Accessed 13 Jan 2020].

Falloon, G. (2011). Exploring the virtual classroom: What students need to know (and teachers should consider). Journal of online learning and teaching. , 7 (4), 439–451.

Falloon, G. W. (2016). An analysis of young students’ thinking when completing basic coding tasks using Scratch Jnr on the iPad. Journal of Computer Assisted Learning , 32 , 576–593.

Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research , 18 (1), 39–50. https://doi.org/10.2307/3151312 .

Furnborough, C. (2012). Making the most of others: Autonomous interdependence in adult beginner distance language learners. Distance Education , 33 (1), 99–116. https://doi.org/10.1080/01587919.2012.667962 .

Galvin, T. (2003). The (22nd Annual) 2003. Industry report. Training , 40 (9), 19–45.

Gouws, L., Bradshaw, K., & Wentworth, P. (2013). Computational thinking in educational activities. In J. Carter, I. Utting, & A. Clear (Eds.), The proceedings of the 18th conference on innovation and Technology in Computer Science Education , (pp. 10–15). Canterbury: ACM.

Hair, J. F., Sarstedt, M., Ringle, C. M., & Mena, J. A. (2012a). An assessment of the use of partial least squares structural equation modeling in marketing research. Journal of the Academy of Marketing Science. , 40 (3), 414–433.

Illinois Online Network. 2003. Learning styles and the online environment. Illinois Online Network and the Board of Trustees of the University of Illinois, http://illinois.online.uillinois.edu/IONresources/instructionaldesign/learningstyles.html

Jacobs, G. M., Renandya, W. A., & Power, M. (2016). Learner autonomy. In G. Jacobs, W. A. Renandya, & M. Power (Eds.), Simple, powerful strategies for student centered learning . New York: Springer International Publishing. https://doi.org/10.1007/978-3-319-25712-9_3 .

Jaques, D., & Salmon, G. (2007). Learning in groups: A handbook for face-to-face and online environments . Abingdon: Routledge.

Kassandrinou, A., Angelaki, C., & Mavroidis, I. (2014). Transactional distance among Open University students: How does it affect the learning progress? European Journal of Open, Distance and e-Learning , 16 (1), 78–93.

Kauffman, H. (2015). A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology , 23 , 1–13. https://doi.org/10.3402/rlt.v23.26507 .

Kirmizi, O. (2014). A Study on the Predictors of Success and Satisfaction in an Online Higher Education Program in Turkey. International Journal of Education , 6 , 4.

Kline, R. B. (2011). Principles and practice of structural equation modeling , (3rd ed., ). New York: The Guilford Press.

Lau, C. Y., & Shaikh, J. M. (2012). The impacts of personal qualities on online learning readiness at Curtin Sarawak Malaysia (CSM). Educational Research and Reviews , 7 (20), 430–444.

Lee, B. C., Yoon, J. O., & Lee, I. (2009). Learners' acceptance of e-learning in South Korea: Theories and results. Computers & Education , 53 , 1320–1329.

Lester, P. M., & King, C. M. (2009). Analog vs. digital instruction and learning: Teaching within first and second life environments. Journal of Computer-Mediated Communication , 14 , 457–483.

Lewis, N. (2004). Military student participation in distance learning . Doctorate dissertation. Johnson & Wales University. USA.

Madjar, N., Nave, A., & Hen, S. (2013). Are teachers’ psychological control, autonomy support and autonomy suppression associated with students’ goals? Educational Studies , 39 (1), 43–55. https://doi.org/10.1080/03055698.2012.667871 .

Mahle, M. (2011). Effects of interaction on student achievement and motivation in distance education. Quarterly Review of Distance Education , 12 (3), 207–215, 222.

Massimo, P. (2014). Multidimensional analysis applied to the quality of the websites: Some empirical evidences from the Italian public sector. Economics and Sociology , 7 (4), 128–138. https://doi.org/10.14254/2071-789X.2014/7-4/9 .

Mathieson, K. (2012). Exploring student perceptions of audiovisual feedback via screen casting in online courses. American Journal of Distance Education , 26 (3), 143–156.

McAuley, A., Stewart, B., Siemens, G., & Cormier, D. (2010). The MOOC model for digital practice (created through funding received by the University of Prince Edward Island through the social sciences and humanities research Council's “knowledge synthesis Grants on the digital economy”) .

Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perception of adopting an information technology innovation. Information System Research , 2 (3), 192–223.

Moore, M. (1990). Background and overview of contemporary American distance education. In M. Moore (Ed.) Contemporary issues in American distance education.

Moore, M. G. (1972). Learner autonomy: The second dimension of independent learning .

Moore, M. G. (2007). Theory of transactional distance. In M. G. Moore (Ed.), Handbook of distance education . Lawrence Erlbaum Associates.

O’Donnell, S. L., Chang, K. B., & Miller, K. S. (2013). Relations among autonomy, attribution style, and happiness in college students. College Student Journal .

Orton-Johnson, K. (2009). ‘I’ve stuck to the path I’m afraid’: Exploring student non-use of blended learning. British Journal of Educational Technology , 40 (5), 837–847.

Osika, R. E., & Sharp, D. P. (2002). Minimum technical competencies for distance learning students. Journal of Research on Technology in Education , 34 (3), 318–325.

Paechter, M., & Maier, B. (2010). Online or face-to-face? Students’ experiences and preferences in e-learning. Internet and Higher Education , 13 (4), 292–297.

Panyajamorn, T., Suthathip, S., Kohda, Y., Chongphaisal, P., & Supnithi, T. (2018). Effectiveness of E learning design and affecting variables in Thai public schools. Malaysian Journal of Learning and Instruction , 15 (1), 1–34.

Pekrun, R., Goetz, T., & Perry, P. R. (2005). Academic Emotions Questionnaire (AEQ): User’s Manual . Munich: University of Munich, Department of Psychology; University of Manitoba. Available online at: https://de.scribd.com/doc/217451779/2005-AEQ-Manual# (Accessed 17 July 2019).

Pintrich, P. R., Smith, D. A. F., Garcia, T., & McKeachie, W. J. (1991). A manual for the use of the motivated strategies for learning questionnaire (MSLQ) . Ann Arbor: The University of Michigan.

Rabinovich, T. (2009). Transactional distance in a synchronous web-extended classroom learning environment . Unpublished doctoral dissertation. Massachusetts: Boston University.

Rogers, E. M. (2003). Diffusion of innovations , (5th ed., ). New York: Free Press.

Salmon, G. (2011). E-moderating: The key to teaching and learning online , (3rd ed., ). London: Routledge.

Salmon, G. (2014). Learning innovation: A framework for transformation. European Journal of Open, Distance and e-Learning , 17 (1), 219–235.

Shearer, R. L. (2010). Transactional distance and dialogue: An exploratory study to refine the theoretical construct of dialogue in online learning. Dissertation Abstracts International Section A , 71 , 800.

Solomon, G., & Schrum, L. (2010). Web 2.0 how-to for educators .

Stroet, K., Opdenakker, M. C., & Minnaert, A. (2013). Effects of need supportive teaching on early adolescents’ motivation and engagement: A review of the literature. Educational Research Review , 9 , 65–87.

Taylor, S., & Todd, P. A. (1995). Assessing IT usage: The role of prior experience. MIS Quarterly , 19 (2), 561–570.

The blended learning impact evaluation at UCF is conducted by Research Initiative for Teaching Effectiveness. (n.d.) https://digitallearning.ucf.edu/learning-analytics/ . Accessed 25 Feb 2020.

Vasala, P., & Andreadou, D. (2010). Student’s support from tutors and peer students in distance learning. Perceptions of Hellenic Open University “studies in education” postgraduate program graduates. Open Education – The Journal for Open and Distance Education and Educational Technology , 6 (1–2), 123–137 (in Greek with English abstract).

Venkatesh, V., Thong, J. Y., & Xu, X. (2012). Consumer acceptance and use of information technology: Extending the unified theory of acceptance and use of technology. MIS Quarterly , 36 (1), 157–178.

Whitmer J.C. (2013). Logging on to improve achievement: Evaluating the relationship between use of the learning management system, student characteristics, and academic achievement in a hybrid large enrollment undergraduate course. Doctorate dissertation, university of California. USA.

Yu, Z. (2015). Indicators of satisfaction in clickers aided EFL class. Frontiers in Psychology , 6 , 587 https://www.frontiersin.org/articles/10.3389/fpsyg.2015.00587/full .

Zhu, C. (2012). Student satisfaction, performance, and knowledge construction in online collaborative learning. Educational Technology & Society , 15 (1), 127–136.

Acknowledgements

Not applicable.

Declarations

The study involved both undergraduate and graduate students at Universiti Teknologi Malaysia (UTM); ethical approval was obtained before collecting any data from the participants.

Author information

Authors and affiliations.

Faculty of Social Sciences & Humanities, School of Education, Universiti Teknologi Malaysia, UTM, 81310, Skudai, Johor, Malaysia

Hassan Abuhassna, Waleed Mugahed Al-Rahmi, Noraffandy Yahya, Megat Aman Zahiri Megat Zakaria & Azlina Bt. Mohd Kosnin

Faculty of Engineering, School of Civil Engineering, Universiti Teknologi Malaysia, UTM, 81310, Skudai, Johor, Malaysia

Mohamad Darwish

Contributions

The corresponding author wrote the paper and collected the data; the second author performed all the statistical analysis. Moreover, all authors worked collaboratively on the literature review and discussion, and read and approved the final manuscript.

Corresponding author

Correspondence to Hassan Abuhassna .

Ethics declarations

Competing interests.

This paper is an original work; its main objective is to develop a model to enhance students’ satisfaction and academic achievement in using online platforms, as Universiti Teknologi Malaysia (UTM) is implementing fully online courses starting from the second semester of 2020.

Supplementary information

Additional file 1.

General objective of the study

About this article

Cite this article.

Abuhassna, H., Al-Rahmi, W.M., Yahya, N. et al. Development of a new model on utilizing online learning platforms to improve students’ academic achievements and satisfaction. Int J Educ Technol High Educ 17 , 38 (2020). https://doi.org/10.1186/s41239-020-00216-z

Received : 10 March 2020

Accepted : 19 May 2020

Published : 02 October 2020

DOI : https://doi.org/10.1186/s41239-020-00216-z

  • Online learning platforms
  • Students’ achievements
  • Students’ satisfaction
  • Transactional distance theory (TDT)
  • Bloom’s taxonomy theory (BTT)

Large language models in education: A focus on the complementary relationship between human teachers and ChatGPT

  • Published: 02 May 2023
  • Volume 28 , pages 15873–15892, ( 2023 )

  • Jaeho Jeon   ORCID: orcid.org/0000-0002-1161-3676 1 &
  • Seongyong Lee   ORCID: orcid.org/0000-0002-9436-4272 2  

Artificial Intelligence (AI) is developing in a manner that blurs the boundaries between specific areas of application and expands its capability to be used in a wide range of applications. The public release of ChatGPT, a generative AI chatbot powered by a large language model (LLM), represents a significant step forward in this direction. Accordingly, professionals predict that this technology will affect education, including the role of teachers. However, despite some assumptions regarding its influence on education, how teachers may actually use the technology and the nature of its relationship with teachers remain under-investigated. Thus, in this study, the relationship between ChatGPT and teachers was explored with a particular focus on identifying the complementary roles of each in education. Eleven language teachers were asked to use ChatGPT for their instruction during a period of two weeks. They then participated in individual interviews regarding their experiences and provided interaction logs produced during their use of the technology. Through qualitative analysis of the data, four ChatGPT roles (interlocutor, content provider, teaching assistant, and evaluator) and three teacher roles (orchestrating different resources with quality pedagogical decisions, making students active investigators, and raising AI ethical awareness) were identified. Based on the findings, an in-depth discussion of teacher-AI collaboration is presented, highlighting the importance of teachers’ pedagogical expertise when using AI tools. Implications regarding the future use of LLM-powered chatbots in education are also provided.

Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.

Author information

Authors and affiliations.

Department of Literacy, Culture, and Language Education, Indiana University, 107 S. Indiana Avenue, Bloomington, IN, 47405-7000, USA

Jaeho Jeon

Department of English Education, Hannam University, 70 Hannam-Ro, Daedeok-Gu, Daejeon, 34430, Republic of Korea

Seongyong Lee

Corresponding author

Correspondence to Seongyong Lee .

About this article

Jeon, J., Lee, S. Large language models in education: A focus on the complementary relationship between human teachers and ChatGPT. Educ Inf Technol 28 , 15873–15892 (2023). https://doi.org/10.1007/s10639-023-11834-1

Received : 21 February 2023

Accepted : 17 April 2023

Published : 02 May 2023

Issue Date : December 2023

DOI : https://doi.org/10.1007/s10639-023-11834-1

  • Large language model
  • Human–computer interaction
  • Artificial intelligence
  • Large language model-powered chatbot
COMMENTS

  1. Systems Research in Education: Designs and methods

    This exploratory paper seeks to shed light on the methodological challenges of education systems research. There is growing consensus that interventions to improve learning outcomes must be designed and studied as part of a broader system of education, and that learning outcomes are affected by a complex web of dynamics involving different inputs, actors, processes and socio-political contexts.

  2. Full article: Understanding inclusive education

    In developing a theoretical model useful to research inclusive education, we used institutional theory to interpret the concept of inclusion. In 1966, Berger and Luckmann described institutionalisation as '[t]he social construction of reality' and claimed that people socially construct institutions through their everyday communication with ...

  3. Re-Thinking an Educational Model Suitable for 21st Century Needs

    Measuring and Comparing Achievements of Learning Outcomes in Higher Education in Europe (CALOHEE), an EU funded project envisages a new model. The paper will partly be based on the (initial) findings of this project. International cooperation in the context of the EHEA is essential to engage all, and make a change.

  4. Artificial intelligence in education: The three paradigms

    The research questions are what are the different roles of AI in education, how AI are connected to the existing educational and learning theories, and to what extent the use of AI technologies influence learning and instruction. In order to locate and summarize relevant papers, the systematic procedures of literature selection and ...

  5. PDF Large Language Models in Education: Vision and Opportunities

    This paper is a systematic summary and analysis of the re-search background, motivation, and applications of educational large models. By reviewing existing research, we provide an in-depth understanding of the potential and challenges of edu-cational large models for education practitioners, researchers,

  6. Innovative Pedagogies of the Future: An Evidence-Based Selection

    Introduction. In its vision for the future of education in 2030, the Organization for Economic Co-operation and Development views essential learner qualities as the acquisition of skills to embrace complex challenges and the development of the person as a whole, valuing common prosperity, sustainability and wellbeing.Wellbeing is perceived as "inclusive growth" related to equitable access ...

  7. Technology acceptance model in educational context: A systematic

    A number of reviews and meta‐analysis focused on specific topics related to technology acceptance in education have been conducted. The Technology Acceptance Model (TAM) is the key model in ...

  8. Development of a new model on utilizing online learning platforms to

    This research aims to explore and investigate potential factors influencing students' academic achievements and satisfaction with using online learning platforms. This study was constructed based on Transactional Distance Theory (TDT) and Bloom's Taxonomy Theory (BTT). This study was conducted on 243 students using online learning platforms in higher education. This research utilized a ...

  9. Systems models in educational research: a review and realignment in the

    A revised model developed for curriculum research is presented. The paper would be of interest to those undertaking education-focused research, scholarly teaching practitioners as well as those with an interest in the use of systems models as a framework for educational alignment.

  10. Mathematical modeling for theory-oriented research in educational

    Mathematical modeling describes how events, concepts, and systems of interest behave in the world using mathematical concepts. This research approach can be applied to theory construction and testing by using empirical data to evaluate whether the specific theory can explain the empirical data or whether the theory fits the data available. Although extensively used in the physical sciences and ...

  11. AI technologies for education: Recent research & future directions

    Research must be data-supported empirical studies. Articles that were solely based on personal opinions or anecdotal experiences were excluded; 3. Research must have investigated educational effects of AI by reporting relevant qualitative or quantitative data. Papers that did not provide any evidence on learning were excluded; 4.

  12. Extending the Technology Acceptance Model to Explore Students

    Technology Acceptance Model (TAM; Davis, 1986, 1989), as one of the most influential frameworks for the exploration of issues regarding technology acceptance and rejection, has been increasingly used in teaching and learning contexts (Al-Emran et al., 2018).The strength of TAM has been confirmed by numerous studies and the model has evolved to become the common ground theory in understanding ...

  13. PDF Competency-based Education: Theory and Practice

    Unfortunately, nations are still in quest of providing an education pursuing this goal. One of these quests is Competency-based education. As Gervais (2016) stated, Competency-based education (CBE) is "a synthesis between a liberal arts education and the professional education movement." Briefly, it is the redesign of the

  14. Research Papers in Education

    Journal overview. Research Papers in Education has developed an international reputation for publishing significant research findings across the discipline of education. The distinguishing feature of the journal is that we publish longer articles than most other journals, to a limit of 12,000 words. We particularly focus on full accounts of ...

  15. Research in Education: Sage Journals

    Research in Education provides a space for fully peer-reviewed, critical, trans-disciplinary, debates on theory, policy and practice in relation to Education. International in scope, we publish challenging, well-written and theoretically innovative contributions that question and explore the concept, practice and institution of Education as an object of study.

  16. PDF An instructional design model with the cultivating research-based

    Full Length Research Paper An instructional design model with the cultivating research-based learning strategies for fostering teacher students' creative thinking abilities Khwanchai Khuana, Tanthip Khuana and Toansakul Santiboon* Department of Innovative Teaching Plans and Instruction Faculty of Education, Kamphaeng Phet Rajabhat University,

  17. Large language models in education: A focus on the ...

    Artificial Intelligence (AI) is developing in a manner that blurs the boundaries between specific areas of application and expands its capability to be used in a wide range of applications. The public release of ChatGPT, a generative AI chatbot powered by a large language model (LLM), represents a significant step forward in this direction. Accordingly, professionals predict that this ...

  18. PDF Designing for Engagement: Using the ADDIE Model to Integrate High ...

    Research Libraries' Framework for Information Literacy for Higher Education. This process may be useful for other librarians who teach online or face-to-face instruction in one-shot or in more extended instructional interactions. LIB250: Introduction to Library Research and Technology in the Information Age

  19. Review of Educational Research: Sage Journals

    Review of Educational Research. The Review of Educational Research (RER) publishes critical, integrative reviews of research literature bearing on education, including conceptualizations, interpretations, and syntheses of literature and scholarly work in a field broadly relevant to … | View full journal description.

  20. Explainable Student Performance Prediction Models: A ...

    Successful prediction of student performance has significant impact to many stakeholders, including students, teachers and educational institutes. In this domain, it is equally important to have accurate and explainable predictions, where accuracy refers to the correctness of the predicted value, and explainability refers to the understandability of the prediction made. In this systematic ...

  21. Research Papers in Education: Vol 39, No 2 (Current issue)

    A structured discussion of the fairness of GCSE and A level grades in England in summer 2020 and 2021. et al. Article | Published online: 18 Feb 2024. Explore the current issue of Research Papers in Education, Volume 39, Issue 2, 2024.

  22. PDF Concepts, Models, and Research of Extended Education

    The purpose of this paper is threefold. First, this study set out to investigate the terms that indicate extended education in each country and region. The cases of nine countries ... S. H. Bae: Concepts, Models, and Research of Extended Education 157 of educational equality and excellence, as the mainstream formal education system does. ...

  23. (PDF) Finland Education System

    FINLAND EDUCATION SYSTEM. Ashok Federick. Finlandia University,Finlad. Email: [email protected]. Abstract. If viewed from a geographical perspective, Finland is a Scandinavian country in ...