
45 survey questions to understand student engagement in online learning.

Nick Woolf

In our work with K-12 school districts during the COVID-19 pandemic, countless district leaders and school administrators have told us how challenging it's been to build student engagement outside of the traditional classroom.

Not only that, but the challenges associated with online learning may have the largest impact on students from marginalized communities. Research suggests that some groups of students experience more difficulty with academic performance and engagement when course content is delivered online vs. face-to-face.

As you look to improve the online learning experience for students, take a moment to understand how students, caregivers, and staff are currently experiencing virtual learning. Where are the areas for improvement? How supported do students feel in their online coursework? Do teachers feel equipped to support students through synchronous and asynchronous facilitation? How confident do families feel in supporting their children at home?

Below, we've compiled a bank of 45 questions to understand student engagement in online learning. Interested in running a student, family, or staff engagement survey? Click here to learn about Panorama's survey analytics platform for K-12 school districts.

Download Toolkit: 9 Virtual Learning Resources to Engage Students, Families, and Staff

45 Questions to Understand Student Engagement in Online Learning

For students (grades 3-5 and 6-12):

1. How excited are you about going to your classes?

2. How often do you get so focused on activities in your classes that you lose track of time?

3. In your classes, how eager are you to participate?

4. When you are not in school, how often do you talk about ideas from your classes?

5. Overall, how interested are you in your classes?

6. What are the most engaging activities that happen in this class?

7. Which aspects of class have you found least engaging?

8. If you were teaching class, what is the one thing you would do to make it more engaging for all students?

9. How do you know when you are feeling engaged in class?

10. What projects/assignments/activities do you find most engaging in this class?

11. What does this teacher do to make this class engaging?

12. How much effort are you putting into your classes right now?

13. How difficult or easy is it for you to try hard on your schoolwork right now?

14. How difficult or easy is it for you to stay focused on your schoolwork right now?

15. If you have missed in-person school recently, why did you miss school?

16. If you have missed online classes recently, why did you miss class?

17. How would you like to be learning right now?

18. How happy are you with the amount of time you spend speaking with your teacher?

19. How difficult or easy is it to use the distance learning technology (computer, tablet, video calls, learning applications, etc.)?

20. What do you like about school right now?

21. What do you not like about school right now?

22. When you have online schoolwork, how often do you have the technology (laptop, tablet, computer, etc.) you need?

23. How difficult or easy is it for you to connect to the internet to access your schoolwork?

24. What has been the hardest part about completing your schoolwork?

25. How happy are you with how much time you spend in specials or enrichment (art, music, PE, etc.)?

26. Are you getting all the help you need with your schoolwork right now?

27. How sure are you that you can do well in school right now?

28. Are there adults at your school you can go to for help if you need it right now?

29. If you are participating in distance learning, how often do you hear from your teachers individually?

For Families, Parents, and Caregivers:

30. How satisfied are you with the way learning is structured at your child’s school right now?

31. Do you think your child should spend less or more time learning in person at school right now?

32. How difficult or easy is it for your child to use the distance learning tools (video calls, learning applications, etc.)?

33. How confident are you in your ability to support your child's education during distance learning?

34. How confident are you that teachers can motivate students to learn in the current model?

35. What is working well with your child’s education that you would like to see continued?

36. What is challenging with your child’s education that you would like to see improved?

37. Does your child have their own tablet, laptop, or computer available for schoolwork when they need it?

38. What best describes your child's typical internet access?

39. Is there anything else you would like us to know about your family’s needs at this time?

For Teachers and Staff:

40. In the past week, how many of your students regularly participated in your virtual classes?

41. In the past week, how engaged have students been in your virtual classes?

42. In the past week, how engaged have students been in your in-person classes?

43. Is there anything else you would like to share about student engagement at this time?

44. What is working well with the current learning model that you would like to see continued?

45. What is challenging about the current learning model that you would like to see improved?

Elevate Student, Family, and Staff Voices This Year With Panorama

Schools and districts can use Panorama’s leading survey administration and analytics platform to quickly gather and take action on information from students, families, teachers, and staff. The questions are applicable to all types of K-12 school settings and grade levels, as well as to communities serving students from a range of socioeconomic backgrounds.


In the Panorama platform, educators can view and disaggregate results by topic, question, demographic group, grade level, school, and more to inform priority areas and action plans. Districts may use the data to improve teaching and learning models, build stronger academic and social-emotional support systems, improve stakeholder communication, and inform staff professional development.
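The kind of disaggregation described above does not require any particular tool. Here is a minimal pandas sketch, assuming a hypothetical CSV export of survey results; the file and column names are placeholders, not Panorama's actual schema or API.

```python
# Minimal sketch (not Panorama's API): disaggregating exported survey results
# by topic, grade level, and demographic group with pandas.
import pandas as pd

results = pd.read_csv("survey_results.csv")  # hypothetical export, one row per response

# Mean favorable-response rate per topic, broken out by grade level and group.
breakdown = (
    results
    .groupby(["topic", "grade_level", "demographic_group"])["favorable"]
    .mean()
    .unstack("demographic_group")
    .round(2)
)
print(breakdown)
```

A summary table like this can then feed the action-planning and communication steps described above.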

To learn more about Panorama's survey platform, get in touch with our team.



Distance learning survey for students: Tips & examples


The COVID-19 pandemic changed learning in unprecedented ways. Students had to not only move to online learning but also keep a social distance from their friends and family. A student interest survey helps customize teaching methods and curriculum to make learning more engaging and relevant to students’ lives. Some students found it quite challenging to adjust to the ‘new normal’ and missed the in-person interaction with their teachers. For others, it simply meant spending more time with their parents.

Schools need to know how students feel about distance education and learn more about their experiences. To collect this data, they can send students a survey on remote learning. Once the results are in, the management team can see what students like about the existing setup and what they would like to change.

A classroom response system lets students answer multiple-choice questions and engage in real-time discussions instantly.

Here are examples of distance learning survey questions you should ask students to collect their feedback.


Examples of distance learning survey questions for students

1. How do you feel overall about distance education?

  • Below Average

This question collects responses about students’ overall experience of online education. Schools can use this data to decide whether they should continue teaching online or move back to in-person learning.

2. Do you have access to a device for learning online?

  • Yes, but it doesn’t work well
  • No, I share with others

Students should have uninterrupted access to a device for learning online. Find out whether they face any challenges with the device’s hardware quality, or whether they share the device with others in the house and can’t access it when they need it.

3. What device do you use for distance learning?

Know whether students use a laptop, desktop, smartphone, or tablet for distance learning. A laptop or desktop would be an ideal choice for its screen size and quality. You can use a multiple-choice question type in your questionnaire for distance education students.

4. How much time do you spend on distance education each day, on average?

Know how much time students spend taking an online course. Analyze whether they are spending too much time and find out the reasons behind it. Students should also set aside some time to play and exercise while staying at home to take care of their health. Answers to this question reveal whether they are able to spend time on other activities as well.

5. How effective has remote learning been for you?

  • Not at all effective
  • Slightly effective
  • Moderately effective
  • Very effective
  • Extremely effective

Depending on an individual’s personality, students may like to learn in the classroom with fellow students or alone at home. The classroom offers a more lively and interactive environment, whereas it is relatively calm at home. You can use this question to know if remote learning is working for students or not. 

6. How helpful has your [School or University] been in offering you the resources to learn from home?

  • Not at all helpful
  • Slightly helpful
  • Moderately helpful
  • Very helpful
  • Extremely helpful

The school management team needs to offer full support to both teachers and students to make distance education comfortable and effective. They should provide support in terms of technological infrastructure and process framework. Given the pandemic, schools should allow more flexibility and adopt less strict policies.

7. How stressful is distance learning for you during the COVID-19 pandemic?

Studying during a pandemic can be quite stressful, especially if you or someone in your family is not doing well. Measure students’ stress levels and identify ways to reduce them; for instance, you can organize an online dance party or a Lego game. The responses to this question can be crucial in deciding the future course of distance learning.

8. How well could you manage time while learning remotely? (Consider 5 being extremely well and 1 being not at all)

  • Academic schedule

Staying at home all the time and balancing multiple things can be stressful for many people. It requires students to have good time-management skills and self-discipline. Students can rate their experience on a scale of 1-5 and share it with the school authorities. Use a multiple-choice matrix question type for such questions in your distance learning questionnaire for students.


9. Do you enjoy learning remotely?

  • Yes, absolutely
  • Yes, but I would like to change a few things
  • No, there are quite a few challenges
  • No, not at all

Get a high-level view of whether students enjoy learning from home or are doing it only because they have to. Gain insights into how you can improve distance education and make it more interesting for them.

10. How helpful are your teachers while studying online?

Distance education lacks proximity with teachers and has its own set of unique challenges. Some students may find it difficult to learn a subject and take more time to understand. This question measures the extent to which students find their teachers helpful.

You can also use a ready-made survey template to save time. The sample questionnaire for students can be easily customized as per your requirements.


Other important distance learning survey questions for students

  • How peaceful is the environment at home while learning?
  • Are you satisfied with the technology and software you are using for online learning?
  • How important is face-to-face communication for you while learning remotely?
  • How often do you talk to your [School/University] classmates?
  • How often do you have a 1-1 discussion with your teachers?

How to create a survey?

The intent behind creating a remote learning questionnaire for students should be to understand how schools and teachers can better support them. Use online survey software like ours to create a survey, or use a template to get started. Distribute the survey through email, mobile app, website, or QR code.

Once you get the survey results, generate reports, and share them with your team. You can also download them in formats like .pdf, .doc, and .xls. To analyze data from multiple resources, you can integrate the survey software with third-party apps.
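As a rough illustration of the reporting and analysis step described above, here is a short pandas sketch. It is not QuestionPro's API; the file names and column names are hypothetical placeholders for whatever your survey export contains.

```python
# Minimal sketch (not QuestionPro's API): combining a survey export with a
# student roster and producing a shareable summary spreadsheet.
import pandas as pd

responses = pd.read_csv("remote_learning_responses.csv")  # hypothetical survey export
roster = pd.read_csv("student_roster.csv")                # hypothetical third-party data

merged = responses.merge(roster, on="student_id", how="left")

# Share of students choosing each option for the overall-experience question,
# broken out by grade.
summary = (
    merged.groupby("grade")["overall_experience"]
    .value_counts(normalize=True)
    .rename("share")
    .reset_index()
)
summary.to_excel("remote_learning_report.xlsx", index=False)  # requires openpyxl
```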

If you need any help with designing a survey, customizing the look and feel, or deriving insights from it, get in touch with us. We’d be happy to help.


Original Research Article

Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future


  • 1 Minerva Schools at Keck Graduate Institute, San Francisco, CA, United States
  • 2 Ronin Institute for Independent Scholarship, Montclair, NJ, United States
  • 3 Department of Physics, University of Toronto, Toronto, ON, Canada

This spring, students across the globe transitioned from in-person classes to remote learning as a result of the COVID-19 pandemic. This unprecedented change to undergraduate education saw institutions adopting multiple online teaching modalities and instructional platforms. We sought to understand students’ experiences with and perspectives on those methods of remote instruction in order to inform pedagogical decisions during the current pandemic and in future development of online courses and virtual learning experiences. Our survey gathered quantitative and qualitative data regarding students’ experiences with synchronous and asynchronous methods of remote learning and specific pedagogical techniques associated with each. A total of 4,789 undergraduate participants representing institutions across 95 countries were recruited via Instagram. We find that most students prefer synchronous online classes, and students whose primary mode of remote instruction has been synchronous report being more engaged and motivated. Our qualitative data show that students miss the social aspects of learning on campus, and it is possible that synchronous learning helps to mitigate some feelings of isolation. Students whose synchronous classes include active-learning techniques (which are inherently more social) report significantly higher levels of engagement, motivation, enjoyment, and satisfaction with instruction. Respondents’ recommendations for changes emphasize increased engagement, interaction, and student participation. We conclude that active-learning methods, which are known to increase motivation, engagement, and learning in traditional classrooms, also have a positive impact in the remote-learning environment. Integrating these elements into online courses will improve the student experience.

Introduction

The COVID-19 pandemic has dramatically changed the demographics of online students. Previously, almost all students engaged in online learning elected the online format, starting with individual online courses in the mid-1990s through today’s robust online degree and certificate programs. These students prioritize convenience, flexibility and ability to work while studying and are older than traditional college age students ( Harris and Martin, 2012 ; Levitz, 2016 ). These students also find asynchronous elements of a course are more useful than synchronous elements ( Gillingham and Molinari, 2012 ). In contrast, students who chose to take courses in-person prioritize face-to-face instruction and connection with others and skew considerably younger ( Harris and Martin, 2012 ). This leaves open the question of whether students who prefer to learn in-person but are forced to learn remotely will prefer synchronous or asynchronous methods. One study of student preferences following a switch to remote learning during the COVID-19 pandemic indicates that students enjoy synchronous over asynchronous course elements and find them more effective ( Gillis and Krull, 2020 ). Now that millions of traditional in-person courses have transitioned online, our survey expands the data on student preferences and explores if those preferences align with pedagogical best practices.

An extensive body of research has explored what instructional methods improve student learning outcomes (Fink, 2013). Considerable evidence indicates that active-learning or student-centered approaches result in better learning outcomes than passive-learning or instructor-centered approaches, both in-person and online (Freeman et al., 2014; Chen et al., 2018; Davis et al., 2018). Active-learning approaches include student activities or discussion in class, whereas passive-learning approaches emphasize extensive exposition by the instructor (Freeman et al., 2014). Constructivist learning theories argue that students must be active participants in creating their own learning, and that listening to expert explanations is seldom sufficient to trigger the neurological changes necessary for learning (Bostock, 1998; Zull, 2002). Some studies conclude that, while students learn more via active learning, they may report greater perceptions of their learning and greater enjoyment when passive approaches are used (Deslauriers et al., 2019). We examine student perceptions of remote learning experiences in light of these previous findings.

In this study, we administered a survey focused on student perceptions of remote learning in late May 2020 through the social media account of @unjadedjade to a global population of English speaking undergraduate students representing institutions across 95 countries. We aim to explore how students were being taught, the relationship between pedagogical methods and student perceptions of their experience, and the reasons behind those perceptions. Here we present an initial analysis of the results and share our data set for further inquiry. We find that positive student perceptions correlate with synchronous courses that employ a variety of interactive pedagogical techniques, and that students overwhelmingly suggest behavioral and pedagogical changes that increase social engagement and interaction. We argue that these results support the importance of active learning in an online environment.

Materials and Methods

Participant Pool

Students were recruited through the Instagram account @unjadedjade. This social media platform, run by influencer Jade Bowler, focuses on education, effective study tips, ethical lifestyle, and promotes a positive mindset. For this reason, the audience is presumably academically inclined, and interested in self-improvement. The survey was posted to her account and received 10,563 responses within the first 36 h. Here we analyze the 4,789 of those responses that came from undergraduates. While we did not collect demographic or identifying information, we suspect that women are overrepresented in these data as followers of @unjadedjade are 80% women. A large minority of respondents were from the United Kingdom as Jade Bowler is a British influencer. Specifically, 43.3% of participants attend United Kingdom institutions, followed by 6.7% attending university in the Netherlands, 6.1% in Germany, 5.8% in the United States and 4.2% in Australia. Ninety additional countries are represented in these data (see Supplementary Figure 1 ).

Survey Design

The purpose of this survey is to learn about students’ instructional experiences following the transition to remote learning in the spring of 2020.

This survey was initially created for a student assignment for the undergraduate course Empirical Analysis at Minerva Schools at KGI. That version served as a robust pre-test and allowed for identification of the primary online platforms used, and the four primary modes of learning: synchronous (live) classes, recorded lectures and videos, uploaded or emailed materials, and chat-based communication. We did not adapt any open-ended questions based on the pre-test survey to avoid biasing the results and only corrected language in questions for clarity. We used these data along with an analysis of common practices in online learning to revise the survey. Our revised survey asked students to identify the synchronous and asynchronous pedagogical methods and platforms that they were using for remote learning. Pedagogical methods were drawn from literature assessing active and passive teaching strategies in North American institutions ( Fink, 2013 ; Chen et al., 2018 ; Davis et al., 2018 ). Open-ended questions asked students to describe why they preferred certain modes of learning and how they could improve their learning experience. Students also reported on their affective response to learning and participation using a Likert scale.

The revised survey also asked whether students had responded to the earlier survey. No significant differences were found between responses of those answering for the first and second times (data not shown). See Supplementary Appendix 1 for survey questions. Survey data was collected from 5/21/20 to 5/23/20.

Qualitative Coding

We applied a qualitative coding framework adapted from Gale et al. (2013) to analyze student responses to open-ended questions. Four researchers read several hundred responses and noted themes that surfaced. We then developed a list of themes inductively from the survey data and deductively from the literature on pedagogical practice ( Garrison et al., 1999 ; Zull, 2002 ; Fink, 2013 ; Freeman et al., 2014 ). The initial codebook was revised collaboratively based on feedback from researchers after coding 20–80 qualitative comments each. Before coding their assigned questions, alignment was examined through coding of 20 additional responses. Researchers aligned in identifying the same major themes. Discrepancies in terms identified were resolved through discussion. Researchers continued to meet weekly to discuss progress and alignment. The majority of responses were coded by a single researcher using the final codebook ( Supplementary Table 1 ). All responses to questions 3 (4,318 responses) and 8 (4,704 responses), and 2,512 of 4,776 responses to question 12 were analyzed. Valence was also indicated where necessary (i.e., positive or negative discussion of terms). This paper focuses on the most prevalent themes from our initial analysis of the qualitative responses. The corresponding author reviewed codes to ensure consistency and accuracy of reported data.
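The theme counts reported in the Results (for example, mentions of social interaction or flexibility) are simple tallies over the coded responses. The coding itself was done manually by the researchers; purely as an illustration of the counting step, with hypothetical theme codes, that tally could look like this:

```python
# Illustrative sketch only (not the authors' code): counting how often each
# manually assigned theme code appears across open-ended responses.
from collections import Counter

# Hypothetical coded responses: one list of theme codes per student response.
coded_responses = [
    ["social_interaction", "engagement"],
    ["flexibility"],
    ["engagement", "motivation", "social_interaction"],
    ["anxiety"],
    ["social_interaction"],
]

mentions = Counter(code for codes in coded_responses for code in codes)
for theme, count in mentions.most_common():
    print(f"{theme}: {count} mentions")
```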

Statistical Analysis

The survey included two sets of Likert-scale questions, one consisting of a set of six statements about students’ perceptions of their experiences following the transition to remote learning ( Table 1 ). For each statement, students indicated their level of agreement with the statement on a five-point scale ranging from 1 (“Strongly Disagree”) to 5 (“Strongly Agree”). The second set asked the students to respond to the same set of statements, but about their retroactive perceptions of their experiences with in-person instruction before the transition to remote learning. This set was not the subject of our analysis but is present in the published survey results. To explore correlations among student responses, we used CrossCat analysis to calculate the probability of dependence between Likert-scale responses ( Mansinghka et al., 2016 ).


Table 1. Likert-scale questions.

Mean values are calculated based on the numerical scores associated with each response. Measures of statistical significance for comparisons between different subgroups of respondents were calculated using a two-sided Mann-Whitney U-test, and p-values reported here are based on this test statistic. We report effect sizes in pairwise comparisons using the common-language effect size, f, which is the probability that the response from a random sample from subgroup 1 is greater than the response from a random sample from subgroup 2. We also examined the effects of different modes of remote learning and technological platforms using ordinal logistic regression. With the exception of the mean values, all of these analyses treat Likert-scale responses as ordinal-scale, rather than interval-scale, data.
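The paper reports the test results but not the code behind them; a minimal sketch of one such pairwise comparison, assuming two hypothetical subgroups of 1-5 Likert responses, could look like this with SciPy:

```python
# Minimal sketch (not the authors' code): two-sided Mann-Whitney U test on
# Likert responses from two subgroups, plus the common-language effect size
# f = U / (n1 * n2), i.e., the probability that a random respondent from
# subgroup 1 reports a higher value than one from subgroup 2.
from scipy.stats import mannwhitneyu

# Hypothetical 1-5 Likert ratings (e.g., enjoyment), by primary mode of learning.
synchronous = [5, 4, 4, 3, 5, 4, 2, 5, 4, 3]
asynchronous = [3, 2, 4, 3, 2, 3, 4, 2, 3, 1]

# Rank-based test: treats the responses as ordinal, not interval, data.
u_stat, p_value = mannwhitneyu(synchronous, asynchronous, alternative="two-sided")

f = u_stat / (len(synchronous) * len(asynchronous))  # ties contribute 0.5 via U
print(f"U = {u_stat:.1f}, p = {p_value:.4f}, f = {f:.2f}")
```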

Students Prefer Synchronous Class Sessions

Students were asked to identify their primary mode of learning given four categories of remote course design that emerged from the pilot survey and across the literature on online teaching: live (synchronous) classes, recorded lectures and videos, emailed or uploaded materials, and chats and discussion forums. While 42.7% (n = 2,045) of students identified live classes as their primary mode of learning, 54.6% (n = 2,613) of students preferred this mode (Figure 1). Both recorded lectures and live classes were preferred over uploaded materials (6.22%, n = 298) and chat (3.36%, n = 161).


Figure 1. Actual (A) and preferred (B) primary modes of learning.

In addition to a preference for live classes, students whose primary mode was synchronous were more likely to enjoy the class, feel motivated and engaged, be satisfied with instruction and report higher levels of participation ( Table 2 and Supplementary Figure 2 ). Regardless of primary mode, over two-thirds of students reported they are often distracted during remote courses.


Table 2. The effect of synchronous vs. asynchronous primary modes of learning on student perceptions.

Variation in Pedagogical Techniques for Synchronous Classes Results in More Positive Perceptions of the Student Learning Experience

To survey the use of passive vs. active instructional methods, students reported the pedagogical techniques used in their live classes. Among the synchronous methods, we identify three different categories (National Research Council, 2000; Freeman et al., 2014). Passive methods (P) include lectures, presentations, and explanation using diagrams, white boards, and/or other media. These methods all rely on instructor delivery rather than student participation. Our next category represents active learning through primarily one-on-one interactions (A). The methods in this group are in-class assessment, question-and-answer (Q&A), and classroom chat. Group interactions (F) included classroom discussions and small-group activities. Given these categories, Mann-Whitney U pairwise comparisons between the 7 possible combinations and Likert-scale responses about student experience showed that the use of a variety of methods resulted in higher ratings of experience vs. the use of a single method, whether or not that single method was active or passive (Table 3). Indeed, students whose classes used methods from each category (PAF) had higher ratings of enjoyment, motivation, and satisfaction with instruction than those who only chose any single method (p < 0.0001) and also reported higher levels of participation and engagement compared to students whose only method was passive (P) or active through one-on-one interactions (A) (p < 0.00001). Student ratings of distraction were not significantly different for any comparison. Given that sets of Likert responses often appeared significant together in these comparisons, we ran a CrossCat analysis to look at the probability of dependence across Likert responses. Responses have a high probability of dependence on each other, limiting what we can claim about any discrete response (Supplementary Figure 3).


Table 3. Comparison of combinations of synchronous methods on student perceptions. Effect size (f).

Mann-Whitney U pairwise comparisons were also used to check whether improvement in student experience was associated with the number of methods used vs. the variety of types of methods. For every comparison, we found that more methods resulted in higher scores on all Likert measures except distraction (Table 4). Even a comparison between four or fewer methods and greater than four methods resulted in a 59% chance that the latter group enjoyed the courses more (p < 0.00001) and a 60% chance that they felt more motivated to learn (p < 0.00001). Students who selected more than four methods (n = 417) were also 65.1% (p < 0.00001), 62.9% (p < 0.00001), and 64.3% (p < 0.00001) more satisfied with instruction, engaged, and actively participating, respectively. Therefore, there was an overlap between how the number and variety of methods influenced students’ experiences. Since the number of techniques per category is 2–3, we cannot fully disentangle the effect of number vs. variety. Pairwise comparisons to look at subsets of data with 2–3 methods from a single group vs. 2–3 methods across groups controlled for this but had low sample numbers in most groups and resulted in no significant findings (data not shown). Therefore, from the data we have in our survey, there seems to be an interdependence between the number and variety of methods on students’ learning experiences.


Table 4. Comparison of the number of synchronous methods on student perceptions. Effect size (f).

Variation in Asynchronous Pedagogical Techniques Results in More Positive Perceptions of the Student Learning Experience

Along with synchronous pedagogical methods, students reported the asynchronous methods that were used for their classes. We divided these methods into three main categories and conducted pairwise comparisons. Learning methods include video lectures, video content, and posted study materials. Interacting methods include discussion/chat forums, live office hours, and email Q&A with professors. Testing methods include assignments and exams. Our results again show the importance of variety in students’ perceptions ( Table 5 ). For example, compared to providing learning materials only, providing learning materials, interaction, and testing improved enjoyment ( f = 0.546, p < 0.001), motivation ( f = 0.553, p < 0.0001), satisfaction with instruction ( f = 0.596, p < 0.00001), engagement ( f = 0.572, p < 0.00001) and active participation ( f = 0.563, p < 0.00001) (row 6). Similarly, compared to just being interactive with conversations, the combination of all three methods improved five out of six indicators, except for distraction in class (row 11).


Table 5. Comparison of combinations of asynchronous methods on student perceptions. Effect size (f).

Ordinal logistic regression was used to assess the likelihood that the platforms students used predicted student perceptions ( Supplementary Table 2 ). Platform choices were based on the answers to open-ended questions in the pre-test survey. The synchronous and asynchronous methods used were consistently more predictive of Likert responses than the specific platforms. Likewise, distraction continued to be our outlier with no differences across methods or platforms.
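The ordinal regressions themselves are not shown in the paper. As a minimal, hypothetical sketch of the approach (not the authors' code), one could regress a 1-5 Likert response on indicator variables for the methods a student reported, using statsmodels' OrderedModel:

```python
# Minimal sketch (not the authors' code): ordinal logistic regression of a
# Likert response on indicators for remote-learning methods. Data are
# synthetic and the predictor names are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300
live = rng.integers(0, 2, n)        # 1 if live classes were used
recorded = rng.integers(0, 2, n)    # 1 if recorded lectures were used
forum = rng.integers(0, 2, n)       # 1 if a discussion forum was used

# Latent engagement rises with the methods used, plus noise; bin it into 1-5.
latent = 0.8 * live + 0.5 * recorded + 0.6 * forum + rng.normal(0, 1, n)
engaged = np.digitize(latent, [-0.5, 0.3, 1.0, 1.7]) + 1

df = pd.DataFrame({"live": live, "recorded": recorded, "forum": forum,
                   "engaged": engaged})

# Likert outcome as an ordered categorical, not an interval-scale number.
endog = df["engaged"].astype(pd.CategoricalDtype(ordered=True))
model = OrderedModel(endog, df[["live", "recorded", "forum"]], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())  # positive coefficients mean higher odds of a higher rating
```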

Students Prefer In-Person and Synchronous Online Learning Largely Due to Social-Emotional Reasoning

As expected, 86.1% (4,123) of survey participants report a preference for in-person courses, while 13.9% (666) prefer online courses. When asked to explain the reasons for their preference, students who prefer in-person courses most often mention the importance of social interaction (693 mentions), engagement (639 mentions), and motivation (440 mentions). These students are also more likely to mention a preference for a fixed schedule (185 mentions) vs. a flexible schedule (2 mentions).

In addition to identifying social reasons for their preference for in-person learning, students’ suggestions for improvements in online learning focus primarily on increasing interaction and engagement, with 845 mentions of live classes, 685 mentions of interaction, 126 calls for increased participation and calls for changes related to these topics such as, “Smaller teaching groups for live sessions so that everyone is encouraged to talk as some people don’t say anything and don’t participate in group work,” and “Make it less of the professor reading the pdf that was given to us and more interaction.”

Students who prefer online learning primarily identify independence and flexibility (214 mentions) and reasons related to anxiety and discomfort in in-person settings (41 mentions). Anxiety was only mentioned 12 times in the much larger group that prefers in-person learning.

The preference for synchronous vs. asynchronous modes of learning follows similar trends ( Table 6 ). Students who prefer live classes mention engagement and interaction most often while those who prefer recorded lectures mention flexibility.


Table 6. Most prevalent themes for students based on their preferred mode of remote learning.

Student Perceptions Align With Research on Active Learning

The first, and most robust, conclusion is that incorporation of active-learning methods correlates with more positive student perceptions of affect and engagement. We can see this clearly in the substantial differences on a number of measures, where students whose classes used only passive-learning techniques reported lower levels of engagement, satisfaction, participation, and motivation when compared with students whose classes incorporated at least some active-learning elements. This result is consistent with prior research on the value of active learning ( Freeman et al., 2014 ).

Though research shows that student learning improves in active learning classes, on campus, student perceptions of their learning, enjoyment, and satisfaction with instruction are often lower in active-learning courses ( Deslauriers et al., 2019 ). Our finding that students rate enjoyment and satisfaction with instruction higher for active learning online suggests that the preference for passive lectures on campus relies on elements outside of the lecture itself. That might include the lecture hall environment, the social physical presence of peers, or normalization of passive lectures as the expected mode for on-campus classes. This implies that there may be more buy-in for active learning online vs. in-person.

A second result from our survey is that student perceptions of affect and engagement are associated with students experiencing a greater diversity of learning modalities. We see this in two different results. First, in addition to the fact that classes that include active learning outperform classes that rely solely on passive methods, we find that on all measures besides distraction, the highest student ratings are associated with a combination of active and passive methods. Second, we find that these higher scores are associated with classes that make use of a larger number of different methods.

This second result suggests that students benefit from classes that make use of multiple different techniques, possibly invoking a combination of passive and active methods. However, it is unclear from our data whether this effect is associated specifically with combining active and passive methods, or if it is associated simply with the use of multiple different methods, irrespective of whether those methods are active, passive, or some combination. The problem is that the number of methods used is confounded with the diversity of methods (e.g., it is impossible for a classroom using only one method to use both active and passive methods). In an attempt to address this question, we looked separately at the effect of number and diversity of methods while holding the other constant. Across a large number of such comparisons, we found few statistically significant differences, which may be a consequence of the fact that each comparison focused on a small subset of the data.

Thus, our data suggest that using a greater diversity of learning methods in the classroom may lead to better student outcomes. This is supported by research on student attention span, which suggests varying delivery after 10–15 min to retain students’ attention (Bradbury, 2016). It is likely that this is more relevant for online learning, where students report high levels of distraction across methods, modalities, and platforms. Given that number and variety are key, and there are few passive learning methods, we can assume that some combination of methods that includes active learning improves student experience. However, it is not clear whether we should predict that this benefit would come simply from increasing the number of different methods used, or if there are benefits specific to combining particular methods. Disentangling these effects would be an interesting avenue for future research.

Students Value Social Presence in Remote Learning

Student responses across our open-ended survey questions show a striking difference in reasons for their preferences compared with traditional online learners, who prefer flexibility (Harris and Martin, 2012; Levitz, 2016). Students’ reasons for preferring in-person classes and synchronous remote classes emphasize the desire for social interaction and echo the research on the importance of social presence for learning in online courses.

Short et al. (1976) outlined Social Presence Theory in depicting students’ perceptions of each other as real in different means of telecommunications. These ideas translate directly to questions surrounding online education and pedagogy in regards to educational design in networked learning where connection across learners and instructors improves learning outcomes especially with “Human-Human interaction” ( Goodyear, 2002 , 2005 ; Tu, 2002 ). These ideas play heavily into asynchronous vs. synchronous learning, where Tu reports students having positive responses to both synchronous “real-time discussion in pleasantness, responsiveness and comfort with familiar topics” and real-time discussions edging out asynchronous computer-mediated communications in immediate replies and responsiveness. Tu’s research indicates that students perceive more interaction with synchronous mediums such as discussions because of immediacy which enhances social presence and support the use of active learning techniques ( Gunawardena, 1995 ; Tu, 2002 ). Thus, verbal immediacy and communities with face-to-face interactions, such as those in synchronous learning classrooms, lessen the psychological distance of communicators online and can simultaneously improve instructional satisfaction and reported learning ( Gunawardena and Zittle, 1997 ; Richardson and Swan, 2019 ; Shea et al., 2019 ). While synchronous learning may not be ideal for traditional online students and a subset of our participants, this research suggests that non-traditional online learners are more likely to appreciate the value of social presence.

Social presence also connects to the importance of social connections in learning. Too often, current systems of education emphasize course content in narrow ways that fail to embrace the full humanity of students and instructors ( Gay, 2000 ). With the COVID-19 pandemic leading to further social isolation for many students, the importance of social presence in courses, including live interactions that build social connections with classmates and with instructors, may be increased.

Limitations of These Data

Our undergraduate data consisted of 4,789 responses from 95 different countries, an unprecedented global scale for research on online learning. However, since respondents were followers of @unjadedjade, who focuses on learning and wellness, these respondents may not represent the average student. Survey samples are often biased by their recruitment techniques, and our sampling bias likely resulted in more robust and thoughtful responses to free-response questions and may have influenced the preference for synchronous classes. It is unlikely that it changed students’ reporting on remote learning pedagogical methods, since those are out of student control.

Though we surveyed a global population, our design was rooted in literature assessing pedagogy in North American institutions. Therefore, our survey may not represent a global array of teaching practices.

This survey was sent out during the initial phase of emergency remote learning for most countries. This has two important implications. First, perceptions of remote learning may be clouded by complications of the pandemic which has increased social, mental, and financial stresses globally. Future research could disaggregate the impact of the pandemic from students’ learning experiences with a more detailed and holistic analysis of the impact of the pandemic on students.

Second, instructors, students and institutions were not able to fully prepare for effective remote education in terms of infrastructure, mentality, curriculum building, and pedagogy. Therefore, student experiences reflect this emergency transition. Single-modality courses may correlate with instructors who lacked the resources or time to learn or integrate more than one modality. Regardless, the main insights of this research align well with the science of teaching and learning and can be used to inform both education during future emergencies and course development for online programs that wish to attract traditional college students.

Global Student Voices Improve Our Understanding of the Experience of Emergency Remote Learning

Our survey shows that global student perspectives on remote learning agree with pedagogical best practices, breaking with the often-found negative reactions of students to these practices in traditional classrooms ( Shekhar et al., 2020 ). Our analysis of open-ended questions and preferences show that a majority of students prefer pedagogical approaches that promote both active learning and social interaction. These results can serve as a guide to instructors as they design online classes, especially for students whose first choice may be in-person learning. Indeed, with the near ubiquitous adoption of remote learning during the COVID-19 pandemic, remote learning may be the default for colleges during temporary emergencies. This has already been used at the K-12 level as snow days become virtual learning days ( Aspergren, 2020 ).

In addition to informing pedagogical decisions, the results of this survey can be used to inform future research. Although we survey a global population, our recruitment method selected for students who are English speakers, likely majority female, and have an interest in self-improvement. Repeating this study with a more diverse and representative sample of university students could improve the generalizability of our findings. While the use of a variety of pedagogical methods is better than a single method, more research is needed to determine what the optimal combinations and implementations are for courses in different disciplines. Though we identified social presence as the major trend in student responses, the over 12,000 open-ended responses from students could be analyzed in greater detail to gain a more nuanced understanding of student preferences and suggestions for improvement. Likewise, outliers could shed light on the diversity of student perspectives that we may encounter in our own classrooms. Beyond this, our findings can inform research that collects demographic data and/or measures learning outcomes to understand the impact of remote learning on different populations.

Importantly, this paper focuses on a subset of responses from the full data set, which includes 10,563 students from secondary school, undergraduate, graduate, or professional school and additional questions about in-person learning. Our full data set is available for anyone to download for continued exploration: https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/2TGOPH.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Ethical review and approval was not required for the study on human participants in accordance with the local legislation and institutional requirements. The patients/participants provided their written informed consent to participate in this study.

Author Contributions

GS: project lead, survey design, qualitative coding, writing, review, and editing. TN: data analysis, writing, review, and editing. CN and PB: qualitative coding. JW: data analysis, writing, and editing. CS: writing, review, and editing. EV and KL: original survey design and qualitative coding. PP: data analysis. JB: original survey design and survey distribution. HH: data analysis. MP: writing. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We want to thank Minerva Schools at KGI for providing funding for summer undergraduate research internships. We also want to thank Josh Fost and Christopher V. H.-H. Chen for discussion that helped shape this project.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.647986/full#supplementary-material

Aspergren, E. (2020). Snow Days Canceled Because of COVID-19 Online School? Not in These School Districts. sec. Education. USA Today. Available online at: https://www.usatoday.com/story/news/education/2020/12/15/covid-school-canceled-snow-day-online-learning/3905780001/ (accessed December 15, 2020).


Bostock, S. J. (1998). Constructivism in mass higher education: a case study. Br. J. Educ. Technol. 29, 225–240. doi: 10.1111/1467-8535.00066


Bradbury, N. A. (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Adv. Physiol. Educ. 40, 509–513. doi: 10.1152/advan.00109.2016


Chen, B., Bastedo, K., and Howard, W. (2018). Exploring best practices for online STEM courses: active learning, interaction & assessment design. Online Learn. 22, 59–75. doi: 10.24059/olj.v22i2.1369

Davis, D., Chen, G., Hauff, C., and Houben, G.-J. (2018). Activating learning at scale: a review of innovations in online learning strategies. Comput. Educ. 125, 327–344. doi: 10.1016/j.compedu.2018.05.019

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., and Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proc. Natl. Acad. Sci. 116, 19251–19257. doi: 10.1073/pnas.1821936116

Fink, L. D. (2013). Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses. Somerset, NJ: John Wiley & Sons, Incorporated.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., et al. (2014). Active learning increases student performance in science, engineering, and mathematics. Proc. Natl. Acad. Sci. 111, 8410–8415. doi: 10.1073/pnas.1319030111

Gale, N. K., Heath, G., Cameron, E., Rashid, S., and Redwood, S. (2013). Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med. Res. Methodol. 13:117. doi: 10.1186/1471-2288-13-117

Garrison, D. R., Anderson, T., and Archer, W. (1999). Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High. Educ. 2, 87–105. doi: 10.1016/S1096-7516(00)00016-6

Gay, G. (2000). Culturally Responsive Teaching: Theory, Research, and Practice. Multicultural Education Series. New York, NY: Teachers College Press.

Gillingham, and Molinari, C. (2012). Online courses: student preferences survey. Internet Learn. 1, 36–45. doi: 10.18278/il.1.1.4

Gillis, A., and Krull, L. M. (2020). COVID-19 remote learning transition in spring 2020: class structures, student perceptions, and inequality in college courses. Teach. Sociol. 48, 283–299. doi: 10.1177/0092055X20954263

Goodyear, P. (2002). “Psychological foundations for networked learning,” in Networked Learning: Perspectives and Issues. Computer Supported Cooperative Work , eds C. Steeples and C. Jones (London: Springer), 49–75. doi: 10.1007/978-1-4471-0181-9_4

Goodyear, P. (2005). Educational design and networked learning: patterns, pattern languages and design practice. Australas. J. Educ. Technol. 21, 82–101. doi: 10.14742/ajet.1344

Gunawardena, C. N. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. Int. J. Educ. Telecommun. 1, 147–166.

Gunawardena, C. N., and Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer mediated conferencing environment. Am. J. Distance Educ. 11, 8–26. doi: 10.1080/08923649709526970

Harris, H. S., and Martin, E. (2012). Student motivations for choosing online classes. Int. J. Scholarsh. Teach. Learn. 6, 1–8. doi: 10.20429/ijsotl.2012.060211

Levitz, R. N. (2016). 2015-16 National Online Learners Satisfaction and Priorities Report. Cedar Rapids: Ruffalo Noel Levitz, 12.

Mansinghka, V., Shafto, P., Jonas, E., Petschulat, C., Gasner, M., and Tenenbaum, J. B. (2016). CrossCat: a fully Bayesian nonparametric method for analyzing heterogeneous, high dimensional data. J. Mach. Learn. Res. 17, 1–49. doi: 10.1007/978-0-387-69765-9_7

National Research Council (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. Washington, DC: National Academies Press, doi: 10.17226/9853

Richardson, J. C., and Swan, K. (2019). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 7, 68–88. doi: 10.24059/olj.v7i1.1864

Shea, P., Pickett, A. M., and Pelz, W. E. (2019). A Follow-up investigation of ‘teaching presence’ in the suny learning network. Online Learn. 7, 73–75. doi: 10.24059/olj.v7i2.1856

Shekhar, P., Borrego, M., DeMonbrun, M., Finelli, C., Crockett, C., and Nguyen, K. (2020). Negative student response to active learning in STEM classrooms: a systematic review of underlying reasons. J. Coll. Sci. Teach. 49, 45–54.

Short, J., Williams, E., and Christie, B. (1976). The Social Psychology of Telecommunications. London: John Wiley & Sons.

Tu, C.-H. (2002). The measurement of social presence in an online learning environment. Int. J. E Learn. 1, 34–45. doi: 10.17471/2499-4324/421

Zull, J. E. (2002). The Art of Changing the Brain: Enriching Teaching by Exploring the Biology of Learning , 1st Edn. Sterling, VA: Stylus Publishing.

Keywords : online learning, COVID-19, active learning, higher education, pedagogy, survey, international

Citation: Nguyen T, Netto CLM, Wilkins JF, Bröker P, Vargas EE, Sealfon CD, Puthipiroj P, Li KS, Bowler JE, Hinson HR, Pujar M and Stein GM (2021) Insights Into Students’ Experiences and Perceptions of Remote Learning Methods: From the COVID-19 Pandemic to Best Practice for the Future. Front. Educ. 6:647986. doi: 10.3389/feduc.2021.647986

Received: 30 December 2020; Accepted: 09 March 2021; Published: 09 April 2021.


Copyright © 2021 Nguyen, Netto, Wilkins, Bröker, Vargas, Sealfon, Puthipiroj, Li, Bowler, Hinson, Pujar and Stein. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Geneva M. Stein, [email protected]

This article is part of the Research Topic

Covid-19 and Beyond: From (Forced) Remote Teaching and Learning to ‘The New Normal’ in Higher Education


A Systematic Review of the Research Topics in Online Learning During COVID-19: Documenting the Sudden Shift

  • Min Young Doo, Kangwon National University, http://orcid.org/0000-0003-3565-2159
  • Meina Zhu, Wayne State University
  • Curtis J. Bonk, Indiana University Bloomington

Since most schools and learners had no choice but to learn online during the pandemic, online learning became the mainstream learning mode rather than a substitute for traditional face-to-face learning. Given this enormous change in online learning, we conducted a systematic review of 191 of the most recent online learning studies published during the COVID-19 era. The systematic review results indicated that the themes regarding “courses and instructors” became popular during the pandemic, whereas most online learning research has focused on “learners” pre-COVID-19. Notably, the research topics “course and instructors” and “course technology” received more attention than prior to COVID-19. We found that “engagement” remained the most common research theme even after the pandemic. New research topics included parents, technology acceptance or adoption of online learning, and learners’ and instructors’ perceptions of online learning.




What we know about online learning and the homework gap amid the pandemic

A sixth grader completes his homework online in his family's living room in Boston on March 31, 2020.

America’s K-12 students are returning to classrooms this fall after 18 months of virtual learning at home during the COVID-19 pandemic. Some students who lacked the home internet connectivity needed to finish schoolwork during this time – an experience often called the “ homework gap ” – may continue to feel the effects this school year.

Here is what Pew Research Center surveys found about the students most likely to be affected by the homework gap and their experiences learning from home.

Children across the United States are returning to physical classrooms this fall after 18 months at home, raising questions about how digital disparities at home will affect the existing homework gap between certain groups of students.

Methodology for each Pew Research Center poll can be found at the links in the post.

With the exception of the 2018 survey, everyone who took part in the surveys is a member of the Center’s American Trends Panel (ATP), an online survey panel that is recruited through national, random sampling of residential addresses. This way nearly all U.S. adults have a chance of selection. The survey is weighted to be representative of the U.S. adult population by gender, race, ethnicity, partisan affiliation, education and other categories. Read more about the  ATP’s methodology .

The 2018 data on U.S. teens comes from a Center poll of 743 U.S. teens ages 13 to 17 conducted March 7 to April 10, 2018, using the NORC AmeriSpeak panel. AmeriSpeak is a nationally representative, probability-based panel of the U.S. household population. Randomly selected U.S. households are sampled with a known, nonzero probability of selection from the NORC National Frame, and then contacted by U.S. mail, telephone or face-to-face interviewers. Read more details about the NORC AmeriSpeak panel methodology .

Around nine-in-ten U.S. parents with K-12 children at home (93%) said their children have had some online instruction since the coronavirus outbreak began in February 2020, and 30% of these parents said it has been very or somewhat difficult for them to help their children use technology or the internet as an educational tool, according to an April 2021 Pew Research Center survey .

A bar chart showing that mothers and parents with lower incomes are more likely than fathers and those with higher incomes to have trouble helping their children with tech for online learning

Gaps existed for certain groups of parents. For example, parents with lower and middle incomes (36% and 29%, respectively) were more likely to report that this was very or somewhat difficult, compared with just 18% of parents with higher incomes.

This challenge was also prevalent for parents in certain types of communities – 39% of rural residents and 33% of urban residents said they have had at least some difficulty, compared with 23% of suburban residents.

Around a third of parents with children whose schools were closed during the pandemic (34%) said that their child encountered at least one technology-related obstacle to completing their schoolwork during that time. In the April 2021 survey, the Center asked parents of K-12 children whose schools had closed at some point about whether their children had faced three technology-related obstacles. Around a quarter of parents (27%) said their children had to do schoolwork on a cellphone, 16% said their child was unable to complete schoolwork because of a lack of computer access at home, and another 14% said their child had to use public Wi-Fi to finish schoolwork because there was no reliable connection at home.

Parents with lower incomes whose children’s schools closed amid COVID-19 were more likely to say their children faced technology-related obstacles while learning from home. Nearly half of these parents (46%) said their child faced at least one of the three obstacles to learning asked about in the survey, compared with 31% of parents with midrange incomes and 18% of parents with higher incomes.

A chart showing that parents with lower incomes are more likely than parents with higher incomes to say their children have faced tech-related schoolwork challenges in the pandemic

Of the three obstacles asked about in the survey, parents with lower incomes were most likely to say that their child had to do their schoolwork on a cellphone (37%). About a quarter said their child was unable to complete their schoolwork because they did not have computer access at home (25%), or that they had to use public Wi-Fi because they did not have a reliable internet connection at home (23%).

A Center survey conducted in April 2020 found that, at that time, 59% of parents with lower incomes who had children engaged in remote learning said their children would likely face at least one of the obstacles asked about in the 2021 survey.

A year into the outbreak, an increasing share of U.S. adults said that K-12 schools have a responsibility to provide all students with laptop or tablet computers in order to help them complete their schoolwork at home during the pandemic. About half of all adults (49%) said this in the spring 2021 survey, up 12 percentage points from a year earlier. An additional 37% of adults said that schools should provide these resources only to students whose families cannot afford them, and just 13% said schools do not have this responsibility.

A bar chart showing that roughly half of adults say schools have responsibility to provide technology to all students during pandemic

While larger shares of both political parties in April 2021 said K-12 schools have a responsibility to provide computers to all students in order to help them complete schoolwork at home, there was a 15-point change among Republicans: 43% of Republicans and those who lean to the Republican Party said K-12 schools have this responsibility, compared with 28% last April. In the 2021 survey, 22% of Republicans also said schools do not have this responsibility at all, compared with 6% of Democrats and Democratic leaners.

Even before the pandemic, Black teens and those living in lower-income households were more likely than other groups to report trouble completing homework assignments because they did not have reliable technology access. Nearly one-in-five teens ages 13 to 17 (17%) said they are often or sometimes unable to complete homework assignments because they do not have reliable access to a computer or internet connection, a 2018 Center survey of U.S. teens found.

A bar chart showing that in 2018, Black teens and those from lower-income households were especially likely to be impacted by the digital 'homework gap'

One-quarter of Black teens said they were at least sometimes unable to complete their homework due to a lack of digital access, including 13% who said this happened to them often. Just 4% of White teens and 6% of Hispanic teens said this often happened to them. (There were not enough Asian respondents in the survey sample to be broken out into a separate analysis.)

A wide gap also existed by income level: 24% of teens whose annual family income was less than $30,000 said the lack of a dependable computer or internet connection often or sometimes prohibited them from finishing their homework, but that share dropped to 9% among teens who lived in households earning $75,000 or more a year.


Katherine Schaeffer is a research analyst at Pew Research Center.



Online and face‐to‐face learning: Evidence from students’ performance during the Covid‐19 pandemic

Carolyn Chisadza

1 Department of Economics, University of Pretoria, Hatfield, South Africa

Matthew Clance

Thulani Mthembu

2 Department of Education Innovation, University of Pretoria, Hatfield, South Africa

Nicky Nicholls

Eleni Yitbarek

This study investigates the factors that predict students' performance after transitioning from face‐to‐face to online learning as a result of the Covid‐19 pandemic. It uses students' responses from survey questions and the difference in the average assessment grades between pre‐lockdown and post‐lockdown at a South African university. We find that students' performance was positively associated with good wifi access, relative to using mobile internet data. We also observe lower academic performance for students who found transitioning to online difficult and who expressed a preference for self‐study (i.e. reading through class slides and notes) over assisted study (i.e. joining live lectures or watching recorded lectures). The findings suggest that improving digital infrastructure and reducing the cost of internet access may be necessary for mitigating the impact of the Covid‐19 pandemic on education outcomes.

1. INTRODUCTION

The Covid‐19 pandemic has been a wake‐up call to many countries regarding their capacity to cater for mass online education. This situation has been further complicated in developing countries, such as South Africa, where the majority of the population lacks access to digital infrastructure. The extended lockdown in South Africa saw most universities that relied mainly on in‐person teaching scrambling to source hardware (e.g. laptops, internet access), software (e.g. Microsoft packages, data analysis packages) and internet data for disadvantaged students in order for the semester to recommence. Not only has the pandemic revealed the already stark inequality within the tertiary student population, but it has also revealed that high internet data costs in South Africa may perpetuate this inequality, making online education relatively inaccessible for disadvantaged students. 1

The lockdown in South Africa made it possible to investigate the changes in second‐year students' performance in the Economics department at the University of Pretoria. In particular, we are interested in assessing what factors predict changes in students' performance after transitioning from face‐to‐face (F2F) to online learning. Our main objectives in answering this research question are to establish what study materials the students were able to access (i.e. slides, recordings, or live sessions) and how students got access to these materials (i.e. the infrastructure they used).

The benefits of education on economic development are well established in the literature (Gyimah‐Brempong,  2011 ), ranging from health awareness (Glick et al.,  2009 ), improved technological innovations, to increased capacity development and employment opportunities for the youth (Anyanwu,  2013 ; Emediegwu,  2021 ). One of the ways in which inequality is perpetuated in South Africa, and Africa as a whole, is through access to education (Anyanwu,  2016 ; Coetzee,  2014 ; Tchamyou et al.,  2019 ); therefore, understanding the obstacles that students face in transitioning to online learning can be helpful in ensuring more equal access to education.

Using students' responses from survey questions and the difference in the average grades between pre‐lockdown and post‐lockdown, our findings indicate that students' performance in the online setting was positively associated with better internet access. Accessing assisted study material, such as narrated slides or recordings of the online lectures, also helped students. We also find lower academic performance for students who reported finding transitioning to online difficult and for those who expressed a preference for self‐study (i.e. reading through class slides and notes) over assisted study (i.e. joining live lectures or watching recorded lectures). The average grades between pre‐lockdown and post‐lockdown were about two points and three points lower for those who reported transitioning to online teaching difficult and for those who indicated a preference for self‐study, respectively. The findings suggest that improving the quality of internet infrastructure and providing assisted learning can be beneficial in reducing the adverse effects of the Covid‐19 pandemic on learning outcomes.

Our study contributes to the literature by examining the changes between students' online (post‐lockdown) performance and their F2F (pre‐lockdown) performance. This approach differs from previous studies that, in most cases, use between‐subject designs where one group of students following online learning is compared to a different group of students attending F2F lectures (Almatra et al., 2015; Brown & Liedholm, 2002). This approach has a limitation in that there may be unobserved characteristics unique to students choosing online learning that differ from those choosing F2F lectures. Our approach avoids this issue because we use a within‐subject design: we compare the performance of the same students who followed F2F learning before lockdown and moved to online learning during lockdown due to the Covid‐19 pandemic. Moreover, the study contributes to the limited literature that compares F2F and online learning in developing countries.

Several studies that have compared the effectiveness of online learning and F2F classes encounter methodological weaknesses, such as small samples, not controlling for demographic characteristics, and substantial differences in course materials and assessments between online and F2F contexts. To address these shortcomings, our study is based on a relatively large sample of students and includes demographic characteristics such as age, gender and perceived family income classification. The lecturer and course materials also remained similar in the online and F2F contexts. A significant proportion of our students indicated that they had never experienced online learning before: less than 20% of the students in the sample had previous experience with online learning. This highlights the fact that online education is still relatively new to most students in our sample.

Given the global experience of the fourth industrial revolution (4IR), 2 with rapidly accelerating technological progress, South Africa needs to be prepared for the possibility of online learning becoming the new norm in the education system. To this end, policymakers may consider engaging with various organizations (schools, universities, colleges, private sector, and research facilities) to adopt interventions that may facilitate the transition to online learning, while at the same time ensuring fair access to education for all students across different income levels. 3

1.1. Related literature

Online learning is a form of distance education which mainly involves internet‐based education where courses are offered synchronously (i.e. live sessions online) and/or asynchronously (i.e. students access course materials online in their own time, which is associated with the more traditional distance education). On the other hand, traditional F2F learning is real time or synchronous learning. In a physical classroom, instructors engage with the students in real time, while in the online format instructors can offer real time lectures through learning management systems (e.g. Blackboard Collaborate), or record the lectures for the students to watch later. Purely online courses are offered entirely over the internet, while blended learning combines traditional F2F classes with learning over the internet, and learning supported by other technologies (Nguyen,  2015 ).

Moreover, designing online courses requires several considerations. These include the quality of the learning environment, the ease of using the learning platform, the learning outcomes to be achieved, instructor support to assist and motivate students to engage with the course material, peer interaction, class participation, the type of assessments (Paechter & Maier, 2010), and the training of the instructor in adopting and introducing new teaching methods online (Lundberg et al., 2008). In online learning, instructors act more as facilitators of learning. On the other hand, traditional F2F classes are structured in such a way that the instructor delivers knowledge, is better able to gauge the understanding and interest of students, can engage in class activities, and can provide immediate feedback on clarifying questions during the class. Additionally, designing traditional F2F courses can be less time consuming for instructors compared to online courses (Navarro, 2000).

Online learning is also particularly suited for nontraditional students who require flexibility due to work or family commitments that are not usually associated with the undergraduate student population (Arias et al.,  2018 ). Initially the nontraditional student belonged to the older adult age group, but with blended learning becoming more commonplace in high schools, colleges and universities, online learning has begun to traverse a wider range of age groups. However, traditional F2F classes are still more beneficial for learners that are not so self‐sufficient and lack discipline in working through the class material in the required time frame (Arias et al.,  2018 ).

For the purpose of this literature review, both pure online and blended learning are considered to be online learning because much of the evidence in the literature compares these two types against traditional F2F learning. The debate in the literature surrounding online learning versus F2F teaching continues to be a contentious one. A review of the literature reveals mixed findings when comparing the efficacy of online learning on student performance in relation to the traditional F2F medium of instruction (Lundberg et al., 2008; Nguyen, 2015). A number of studies conducted before the 2000s find what is known today in the empirical literature as the “No Significant Difference” phenomenon (Russell & International Distance Education Certificate Center (IDECC), 1999). The seminal work from Russell and IDECC (1999) involved over 350 comparative studies on online/distance learning versus F2F learning, dating back to 1928. The author finds no significant difference overall between online and traditional F2F classroom education outcomes. Subsequent studies find similar “no significant difference” outcomes (Arbaugh, 2000; Fallah & Ubell, 2000; Freeman & Capper, 1999; Johnson et al., 2000; Neuhauser, 2002). While Bernard et al. (2004) also find that overall there is no significant difference in achievement between online education and F2F education, the study does find significant heterogeneity in student performance for different activities. The findings show that students in F2F classes outperform students participating in synchronous online classes (i.e. classes that require online students to participate in live sessions at specific times). However, asynchronous online classes (i.e. students access class materials online in their own time) outperform F2F classes.

More recent studies find significant results for online learning outcomes in relation to F2F outcomes. On the one hand, Shachar and Yoram (2003) and Shachar and Neumann (2010) conduct a meta‐analysis of studies from 1990 to 2009 and find that in 70% of the cases, students taking courses by online education outperformed students in traditionally instructed courses (i.e. F2F lectures). In addition, Navarro and Shoemaker (2000) observe that learning outcomes for online learners are as effective as or better than outcomes for F2F learners, regardless of background characteristics. In a study on computer science students, Dutton et al. (2002) find that online students perform significantly better than students who take the same course on campus. A meta‐analysis conducted by the US Department of Education finds that students who took all or part of their course online performed better, on average, than those taking the same course through traditional F2F instruction. The report also finds that the effect sizes are larger for studies in which the online learning was collaborative or instructor‐driven than in those studies where online learners worked independently (Means et al., 2010).

On the other hand, evidence by Brown and Liedholm (2002) based on test scores from macroeconomics students in the United States suggests that F2F students tend to outperform online students. These findings are supported by Coates et al. (2004), who base their study on macroeconomics students in the United States, and Xu and Jaggars (2014), who find negative effects for online students using a data set of about 500,000 courses taken by over 40,000 students in Washington. Furthermore, Almatra et al. (2015) compare overall course grades between online and F2F students for a telecommunications course and find that F2F students significantly outperform online learning students. In an experimental study where students are randomly assigned to attend live lectures versus watching the same lectures online, Figlio et al. (2013) observe some evidence that the traditional format has a positive effect compared to the online format. Interestingly, Callister and Love (2016) specifically compare the learning outcomes of online versus F2F skills‐based courses and find that F2F learners earned better outcomes than online learners even when using the same technology. This study highlights that some of the inconsistencies in the results comparing online to F2F learning might be influenced by the nature of the course: theory‐based courses might depend less on in‐person interaction than skills‐based courses.

The reviewed studies on the effects of F2F versus online learning on student performance have mainly been conducted in developed countries, which highlights the dearth of similar studies in developing countries. This gap in the literature also underscores a salient point: online learning is still relatively underexplored in developing countries. The lockdown in South Africa therefore provides us with an opportunity to contribute to the existing literature from a developing country context.

2. CONTEXT OF STUDY

South Africa went into national lockdown in March 2020 due to the Covid‐19 pandemic. As at most universities in the country, the first semester for undergraduate courses at the University of Pretoria had already been running since the start of the academic year in February. Before the pandemic, a number of F2F lectures and assessments had already been conducted in most courses. The nationwide lockdown forced the university, which had relied mainly on in‐person teaching, to move to fully online learning for the remainder of the semester. This forced shift from F2F teaching to online learning allows us to investigate the changes in students' performance.

Before lockdown, classes were conducted on campus. During lockdown, these live classes were moved to an online platform, Blackboard Collaborate, which could be accessed by all registered students on the university intranet (“ClickUP”). However, these live online lectures involved substantial internet data costs for students. To ensure access to course content for those students who were unable to attend the live online lectures due to poor internet connections or internet data costs, several options for accessing course content were made available. These options included prerecorded narrated slides (which required less internet data), recordings of the live online lectures, PowerPoint slides with explanatory notes and standard PDF lecture slides.

At the same time, the university managed to procure and loan out laptops to a number of disadvantaged students, and negotiated with major mobile internet data providers in the country for students to have free access to study material through the university's “connect” website (also referred to as the zero‐rated website). However, this free access excluded some video content and live online lectures (see Table  1 ). The university also provided between 10 and 20 gigabytes of mobile internet data per month, depending on the network provider, sent to students' mobile phones to assist with internet data costs.

Sites available on zero‐rated website

Note : The table summarizes the sites that were available on the zero‐rated website and those that incurred data costs.

High data costs continue to be a contentious issue in Africa, where average incomes are low. Gilbert (2019) reports that South Africa ranked 16th of the 45 countries researched in terms of the most expensive internet data in Africa, at US$6.81 per gigabyte, in comparison to other Southern African countries such as Mozambique (US$1.97), Zambia (US$2.70), and Lesotho (US$4.09). Internet data prices have also been called into question in South Africa after the Competition Commission published a report from its Data Services Market Inquiry calling the country's internet data pricing “excessive” (Gilbert, 2019).

3. EMPIRICAL APPROACH

We use a sample of 395 second‐year students taking a macroeconomics module in the Economics department to compare the effects of F2F and online learning on students' performance using a range of assessments. The module was an introduction to the application of theoretical economic concepts. The content was both theory‐based (developing economic growth models using concepts and equations) and skills‐based (application involving the collection of data from online data sources and analyzing the data using statistical software). Both individual and group assignments formed part of the assessments. Before the end of the semester, during lockdown in June 2020, we asked the students to complete a survey with questions related to the transition from F2F to online learning and the difficulties that they may have faced. For example, we asked the students: (i) how easy or difficult they found the transition from F2F to online lectures; (ii) what internet options were available to them and which they used the most to access the online prescribed work; (iii) what format of content they accessed and which they preferred the most (i.e. self‐study material in the form of PDF and PowerPoint slides with notes vs. assisted study with narrated slides and lecture recordings); (iv) what difficulties they faced accessing the live online lectures, to name a few. Figure 1 summarizes the key survey questions that we asked the students regarding their transition from F2F to online learning.

Figure 1. Summary of survey data

Before the lockdown, the students had already attended several F2F classes and completed three assessments. We are therefore able to create a dependent variable from the average grades of the three assignments taken before lockdown and the average grades of the three assignments taken after the start of the lockdown for each student. Specifically, we use the difference between the post‐ and pre‐lockdown average grades as the dependent variable. However, the number of student observations dropped to 275 due to some students missing one or more of the assessments. The lecturer, content and format of the assessments remain similar across the module. We estimate the following equation using ordinary least squares (OLS) with robust standard errors:

Y_i = α + B_i β + X_i γ + ε_i

where Y_i is the student's performance, measured by the difference between the post‐ and pre‐lockdown average grades. B is the vector of determinants that measure the difficulty faced by students in transitioning from F2F to online learning. This vector includes access to the internet, study material preferred, quality of the online live lecture sessions and pre‐lockdown class attendance. X is the vector of student demographic controls such as race, gender and an indicator for whether the student's perceived family income is below average. The ε_i term captures unobserved student characteristics.
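To make the estimation concrete, here is a minimal sketch (not the authors' code) of how such an OLS specification with robust standard errors could be run with statsmodels; the file name and column names (e.g. avg_pre, avg_post, wifi, transition_difficult) are assumptions for illustration only.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical student-level data set: one row per student, with pre- and
# post-lockdown average assessment grades and the survey responses.
df = pd.read_csv("student_survey_grades.csv")

# Dependent variable: post-lockdown minus pre-lockdown average grade.
df["grade_change"] = df["avg_post"] - df["avg_pre"]

# B_i: transition and access variables; X_i: demographic controls.
ols_model = smf.ols(
    "grade_change ~ transition_difficult + wifi + zero_rated"
    " + prefers_self_study + poor_lecture_quality + attended_often"
    " + C(race) + male + low_income",
    data=df,
).fit(cov_type="HC1")  # heteroskedasticity-robust standard errors

print(ols_model.summary())
```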

4. ANALYSIS

4.1. Descriptive statistics

Table 2 gives an overview of the sample of students. We find that a higher proportion of black students reported finding the transition to online learning difficult. On the other hand, more white students reported finding the transition moderately easy, as did students of other races. According to Coetzee (2014), the quality of schools can vary significantly between higher‐income and lower‐income areas, with black South Africans far more likely to live in lower‐income areas with lower quality schools than white South Africans. As such, these differences in quality of education from secondary schooling can persist at tertiary level. Furthermore, persistent income inequality between races in South Africa likely means that many poorer black students might not be able to afford wifi connections or large internet data bundles, which can make the transition more difficult for black students compared to their white counterparts.

Descriptive statistics

Notes: The transition difficulty variable was ordered 1: Very Easy; 2: Moderately Easy; 3: Difficult; and 4: Impossible. Since we have few responses at the extremes, we combined Very Easy with Moderately Easy, and Difficult with Impossible, to make the table easier to read. The table with a full breakdown is available upon request.

A higher proportion of students reported that wifi access made the transition to online learning moderately easy. However, relatively more students reported that mobile internet data and accessing the zero‐rated website made the transition difficult. Surprisingly, not many students made use of the zero‐rated website, which was freely available. Figure 2 shows that students who reported difficulty transitioning to online learning did not perform as well online, relative to their F2F performance, as those who found the transition less difficult.

Figure 2. Transition from F2F to online learning.

Notes : This graph shows the students' responses to the question “How easy did you find the transition from face‐to‐face lectures to online lectures?” in relation to the outcome variable for performance

In Figure  3 , the kernel density shows that students who had access to wifi performed better than those who used mobile internet data or the zero‐rated data.

Figure 3. Access to online learning.

Notes : This graph shows the students' responses to the question “What do you currently use the most to access most of your prescribed work?” in relation to the outcome variable for performance
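As an illustration of how a comparison like Figure 3 could be produced, the sketch below (again with assumed file and column names, such as access_type) plots a kernel density estimate of the grade change for each internet-access group.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.stats import gaussian_kde

df = pd.read_csv("student_survey_grades.csv")        # same assumed file as the earlier sketch
df["grade_change"] = df["avg_post"] - df["avg_pre"]

# One density curve per internet-access group.
for group in ["wifi", "mobile_data", "zero_rated"]:
    values = df.loc[df["access_type"] == group, "grade_change"].dropna()
    xs = np.linspace(values.min(), values.max(), 200)
    plt.plot(xs, gaussian_kde(values)(xs), label=group)

plt.xlabel("Post- minus pre-lockdown average grade")
plt.ylabel("Density")
plt.legend()
plt.show()
```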

The regression results are reported in Table 3. We find that the change in students' performance from F2F to online learning is negatively associated with the difficulty they faced in transitioning from F2F to online learning. According to student survey responses, factors contributing to difficulty in transitioning included poor internet access, high internet data costs and a lack of equipment such as laptops or tablets to access the study materials on the university website. Students who had access to wifi (i.e. fixed wireless broadband, Asymmetric Digital Subscriber Line (ADSL) or optic fiber) performed significantly better, with grades on average 4.5 points higher, relative to students who had to use mobile internet data (i.e. personal mobile internet data, wifi at home using mobile internet data, or a hotspot using mobile internet data) or the zero‐rated website to access the study materials. The insignificant results for the zero‐rated website are surprising given that the website was freely available and did not incur any internet data costs. However, most students in this sample complained that the internet connection on the zero‐rated website was slow, especially when uploading assignments. They also complained about being disconnected in the middle of an assessment. This may have discouraged some students from making use of the zero‐rated website.

Results: Predictors of student performance using the difference in average assessment grades between pre‐ and post‐lockdown

Coefficients reported. Robust standard errors in parentheses.

∗∗∗ p  < .01.

Students who expressed a preference for self‐study approaches (i.e. reading PDF slides or PowerPoint slides with explanatory notes) did not perform as well, on average, as students who preferred assisted study (i.e. listening to recorded narrated slides or lecture recordings). This result is in line with Means et al. ( 2010 ), where student performance was better for online learning that was collaborative or instructor‐driven than in cases where online learners worked independently. Interestingly, we also observe that the performance of students who often attended in‐person classes before the lockdown decreased. Perhaps these students found the F2F lectures particularly helpful in mastering the course material. From the survey responses, we find that a significant proportion of the students (about 70%) preferred F2F to online lectures. This preference for F2F lectures may also be linked to the factors contributing to the difficulty some students faced in transitioning to online learning.

We find that the performance of low‐income students decreased post‐lockdown, which highlights another potential challenge to transitioning to online learning. The picture and sound quality of the live online lectures also contributed to lower performance. Although this result is not statistically significant, it is worth noting as the implications are linked to the quality of infrastructure currently available for students to access online learning. We find no significant effects of race on changes in students' performance, though males appeared to struggle more with the shift to online teaching than females.

For the robustness check in Table  4 , we consider the average grades of the three assignments taken after the start of the lockdown as a dependent variable (i.e. the post‐lockdown average grades for each student). We then include the pre‐lockdown average grades as an explanatory variable. The findings and overall conclusions in Table  4 are consistent with the previous results.
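A sketch of that robustness specification, under the same assumed data set and column names as the earlier sketch: the post-lockdown average grade becomes the outcome and the pre-lockdown average enters as an explanatory variable.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_survey_grades.csv")  # same assumed columns as before

robust_model = smf.ols(
    "avg_post ~ avg_pre + transition_difficult + wifi + zero_rated"
    " + prefers_self_study + C(race) + male + low_income",
    data=df,
).fit(cov_type="HC1")

print(robust_model.summary())
```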

Robustness check: Predictors for student performance using the average assessment grades for post‐lockdown

As a further robustness check in Table  5 , we create a panel for each student across the six assignment grades so we can control for individual heterogeneity. We create a post‐lockdown binary variable that takes the value of 1 for the lockdown period and 0 otherwise. We interact the post‐lockdown dummy variable with a measure for transition difficulty and internet access. The internet access variable is an indicator variable for mobile internet data, wifi, or zero‐rated access to class materials. The variable wifi is a binary variable taking the value of 1 if the student has access to wifi and 0 otherwise. The zero‐rated variable is a binary variable taking the value of 1 if the student used the university's free portal access and 0 otherwise. We also include assignment and student fixed effects. The results in Table  5 remain consistent with our previous findings that students who had wifi access performed significantly better than their peers.

Interaction model

Notes: Coefficients reported. Robust standard errors in parentheses. The dependent variable is the assessment grade for each student on each assignment. The number of observations equals the number of pre‐ and post‐lockdown assessments multiplied by the number of students.
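To make the interaction model concrete, here is a rough sketch under assumed column names: the data are in long format with one row per student-assignment, the post-lockdown dummy is interacted with the transition-difficulty and access indicators, and student and assignment fixed effects absorb the corresponding main effects.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed long-format file: one row per student-assignment, with the grade,
# a post-lockdown dummy, and the student-level survey indicators.
panel = pd.read_csv("student_assignment_grades.csv")

fe_model = smf.ols(
    "grade ~ post_lockdown:transition_difficult + post_lockdown:wifi"
    " + post_lockdown:zero_rated"
    " + C(student_id) + C(assignment_id)",  # student and assignment fixed effects
    data=panel,
).fit(cov_type="HC1")                       # robust standard errors

# Only the interaction coefficients are of interest here.
print(fe_model.params.filter(like="post_lockdown"))
```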

6. CONCLUSION

The Covid‐19 pandemic left many education institutions with no option but to transition to online learning. The University of Pretoria was no exception. We examine the effect of transitioning to online learning on the academic performance of second‐year economics students. We use assessment results from F2F lectures before lockdown and online lectures post‐lockdown for the same group of students, together with responses from survey questions. We find that the main contributor to lower academic performance in the online setting was poor internet access, which made transitioning to online learning more difficult. In addition, opting to self‐study (reading notes instead of joining online classes and/or watching recordings) did not help students' performance.

The implications of the results highlight the need for improved internet infrastructure with affordable internet data pricing. Despite the university's best efforts not to leave any student behind, with the zero‐rated website and free monthly internet data, the inequality dynamics in the country are such that some students were invariably negatively affected by this transition, not because they were struggling academically, but because they could not access reliable internet (wifi). While the zero‐rated website is a good collaborative initiative between universities and network providers, the infrastructure is not sufficient to accommodate large numbers of students accessing it simultaneously.

This study's findings may highlight some shortcomings in the academic sector that need to be addressed by both the public and private sectors. There is potential for an increase in the digital divide resulting from the inequitable distribution of digital infrastructure. This may lead to reinforcement of current inequalities in accessing higher education in the long term. To prepare the country for online learning, internet data tariffs may need to become more affordable and internet access extended to all. We hope that this study's findings will provide a platform for policy engagement in this regard, or will at least start the conversation about taking remedial action.

We are aware of some limitations presented by our study. The sample we have at hand makes it difficult to extrapolate our findings to either all students at the University of Pretoria or other higher education students in South Africa. Despite this limitation, our findings highlight the negative effect of the digital divide on students' educational outcomes in the country. The transition to online learning and the high internet data costs in South Africa can also have adverse learning outcomes for low‐income students. With higher education institutions, such as the University of Pretoria, integrating online teaching to overcome the effect of the Covid‐19 pandemic, access to stable internet is vital for students' academic success.

It is also important to note that the data we have at hand do not allow us to isolate wifi's causal effect on students' performance post‐lockdown, for two main reasons. First, wifi access is not randomly assigned; for instance, there is a high chance that students from better‐off family backgrounds have better access to wifi and other supplementary infrastructure than their poorer counterparts. Second, due to the university's data access policy and consent requirements, we could not merge the data at hand with students' previous year's performance. Therefore, future research might examine these elements to document the causal impact of access to wifi on students' educational outcomes in the country.

ACKNOWLEDGMENT

The authors acknowledge the helpful comments received from the editor, the anonymous reviewers, and Elizabeth Asiedu.

Chisadza, C., Clance, M., Mthembu, T., Nicholls, N., & Yitbarek, E. (2021). Online and face‐to‐face learning: Evidence from students' performance during the Covid‐19 pandemic. African Development Review, 33, S114–S125. https://doi.org/10.1111/afdr.12520

1 https://mybroadband.co.za/news/cellular/309693-mobile-data-prices-south-africa-vs-the-world.html .

2 The 4IR is currently characterized by increased use of new technologies, such as advanced wireless technologies, artificial intelligence, cloud computing, robotics, among others. This era has also facilitated the use of different online learning platforms ( https://www.brookings.edu/research/the-fourth-industrialrevolution-and-digitization-will-transform-africa-into-a-global-powerhouse/ ).

3 Note that we control for income, but it is plausible to assume other unobservable factors such as parental preference and parenting style might also affect access to the internet of students.

  • Almatra, O., Johri, A., Nagappan, K., & Modanlu, A. (2015). An empirical study of face‐to‐face and distance learning sections of a core telecommunication course (Conference Proceedings Paper No. 12944). 122nd ASEE Annual Conference and Exposition, Seattle, Washington State.
  • Anyanwu, J. C. (2013). Characteristics and macroeconomic determinants of youth employment in Africa. African Development Review, 25(2), 107–129.
  • Anyanwu, J. C. (2016). Accounting for gender equality in secondary school enrolment in Africa. African Development Review, 28(2), 170–191.
  • Arbaugh, J. (2000). Virtual classroom versus physical classroom: An exploratory study of class discussion patterns and student learning in an asynchronous internet‐based MBA course. Journal of Management Education, 24(2), 213–233.
  • Arias, J. J., Swinton, J., & Anderson, K. (2018). On‐line vs. face‐to‐face: A comparison of student outcomes with random assignment. e‐Journal of Business Education and Scholarship of Teaching, 12(2), 1–23.
  • Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare with classroom instruction? A meta‐analysis of the empirical literature. Review of Educational Research, 74(3), 379–439.
  • Brown, B., & Liedholm, C. (2002). Can web courses replace the classroom in principles of microeconomics? American Economic Review, 92(2), 444–448.
  • Callister, R. R., & Love, M. S. (2016). A comparison of learning outcomes in skills‐based courses: Online versus face‐to‐face formats. Decision Sciences Journal of Innovative Education, 14(2), 243–256.
  • Coates, D., Humphreys, B. R., Kane, J., & Vachris, M. A. (2004). "No significant distance" between face‐to‐face and online instruction: Evidence from principles of economics. Economics of Education Review, 23(5), 533–546.
  • Coetzee, M. (2014). School quality and the performance of disadvantaged learners in South Africa (Working Paper No. 22). University of Stellenbosch Economics Department, Stellenbosch.
  • Dutton, J., Dutton, M., & Perry, J. (2002). How do online students differ from lecture students? Journal of Asynchronous Learning Networks, 6(1), 1–20.
  • Emediegwu, L. (2021). Does educational investment enhance capacity development for Nigerian youths? An autoregressive distributed lag approach. African Development Review, 32(S1), S45–S53.
  • Fallah, M. H., & Ubell, R. (2000). Blind scores in a graduate test: Conventional compared with web‐based outcomes. ALN Magazine, 4(2).
  • Figlio, D., Rush, M., & Yin, L. (2013). Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. Journal of Labor Economics, 31(4), 763–784.
  • Freeman, M. A., & Capper, J. M. (1999). Exploiting the web for education: An anonymous asynchronous role simulation. Australasian Journal of Educational Technology, 15(1), 95–116.
  • Gilbert, P. (2019). The most expensive data prices in Africa. Connecting Africa. https://www.connectingafrica.com/author.asp?section_id=761%26doc_id=756372
  • Glick, P., Randriamamonjy, J., & Sahn, D. (2009). Determinants of HIV knowledge and condom use among women in Madagascar: An analysis using matched household and community data. African Development Review, 21(1), 147–179.
  • Gyimah‐Brempong, K. (2011). Education and economic development in Africa. African Development Review, 23(2), 219–236.
  • Johnson, S., Aragon, S., Shaik, N., & Palma‐Rivas, N. (2000). Comparative analysis of learner satisfaction and learning outcomes in online and face‐to‐face learning environments. Journal of Interactive Learning Research, 11(1), 29–49.
  • Lundberg, J., Merino, D., & Dahmani, M. (2008). Do online students perform better than face‐to‐face students? Reflections and a short review of some empirical findings. Revista de Universidad y Sociedad del Conocimiento, 5(1), 35–44.
  • Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence‐based practices in online learning: A meta‐analysis and review of online learning studies (Report No. ed‐04‐co‐0040 task 0006). U.S. Department of Education, Office of Planning, Evaluation, and Policy Development, Washington DC.
  • Navarro, P. (2000). Economics in the cyber‐classroom. Journal of Economic Perspectives, 14(2), 119–132.
  • Navarro, P., & Shoemaker, J. (2000). Performance and perceptions of distance learners in cyberspace. American Journal of Distance Education, 14(2), 15–35.
  • Neuhauser, C. (2002). Learning style and effectiveness of online and face‐to‐face instruction. American Journal of Distance Education, 16(2), 99–113.
  • Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Teaching and Learning, 11(2), 309–319.
  • Paechter, M., & Maier, B. (2010). Online or face‐to‐face? Students' experiences and preferences in e‐learning. Internet and Higher Education, 13(4), 292–297.
  • Russell, T. L., & International Distance Education Certificate Center (IDECC). (1999). The no significant difference phenomenon: A comparative research annotated bibliography on technology for distance education: As reported in 355 research reports, summaries and papers. North Carolina State University.
  • Shachar, M., & Neumann, Y. (2010). Twenty years of research on the academic performance differences between traditional and distance learning: Summative meta‐analysis and trend examination. MERLOT Journal of Online Learning and Teaching, 6(2), 318–334.
  • Shachar, M., & Yoram, N. (2003). Differences between traditional and distance education academic performances: A meta‐analytic approach. International Review of Research in Open and Distance Learning, 4(2), 1–20.
  • Tchamyou, V. S., Asongu, S., & Odhiambo, N. (2019). The role of ICT in modulating the effect of education and lifelong learning on income inequality and economic growth in Africa. African Development Review, 31(3), 261–274.
  • Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face‐to‐face courses: Differences across types of students and academic subject areas. The Journal of Higher Education, 85(5), 633–659.


The effects of online education on academic success: A meta-analysis study

  • Published: 06 September 2021
  • Volume 27, pages 429–450 (2022)


  • Hakan Ulum (ORCID: 0000-0002-1398-6935)


The purpose of this study is to analyze the effect of online education, which has been used extensively since the beginning of the pandemic, on student achievement. In line with this purpose, a meta-analysis was carried out of related studies focusing on the effect of online education on students' academic achievement in several countries between the years 2010 and 2021. Furthermore, this study will provide a source to help future studies compare the effect of online education on academic achievement before and after the pandemic. The meta-analysis comprises 27 studies in total, conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia. The included studies are experimental studies, and the total sample size is 1772. The funnel plot, Duval and Tweedie's trim-and-fill analysis, Orwin's fail-safe N analysis, and Egger's regression test were used to assess publication bias, which was found to be quite low. Hedges' g was employed to measure the effect size for the difference between means, calculated in accordance with the random effects model. The results show that the effect size of online education on academic achievement is at a medium level. The heterogeneity test results display that the effect size does not differ in terms of class level, country, online education approach, or lecture moderators.


1 Introduction

Information and communication technologies have become a powerful force in transforming educational settings around the world. The pandemic has been an important factor in moving traditional physical classroom settings online through the adoption of information and communication technologies, and it has accelerated this transformation. The literature indicates that learning environments connected to information and communication technologies highly satisfy students, so there is good reason to maintain interest in technology-based learning environments. Clearly, technology has had a huge impact on young people's online lives, and this digital revolution can synergize with the educational ambitions and interests of digitally immersed students. In essence, COVID-19 has provided us with an opportunity to embrace online learning, as education systems have to keep up with the rapid emergence of new technologies.

Information and communication technologies, which affect all spheres of life, are also actively used in the field of education. With recent developments, using technology in education has become inevitable for personal and social reasons (Usta, 2011a). Online education can be given as an example of using information and communication technologies as a consequence of these technological developments. It is also clear that online learning is a popular way of obtaining instruction (Demiralay et al., 2016; Pillay et al., 2007), which is defined by Horton (2000) as a form of education delivered through a web browser or an online application without requiring extra software or additional learning resources. Furthermore, online learning is described as a way of using the internet to obtain learning resources during the learning process, to interact with the content, the teacher, and other learners, and to get support throughout the learning process (Ally, 2004). Online learning has such benefits as learning independently at any time and place (Vrasidas & McIsaac, 2000), convenience (Poole, 2000), flexibility (Chizmar & Walbert, 1999), self-regulation skills (Usta, 2011b), collaborative learning, and the opportunity to plan one's own learning process.

Even though online education practices were not as comprehensive as they are now, the internet and computers have long been used in education as alternative learning tools in line with advances in technology. The first distance education attempt in the world was the 'Steno Courses' announcement published in a Boston newspaper in 1728. In the nineteenth century, a Swedish university started 'Correspondence Composition Courses' for women, and the University Correspondence College was subsequently founded for correspondence courses in 1843 (Arat & Bakan, 2011). More recently, distance education has been delivered through computers, assisted by internet technologies, and it has since evolved into mobile education practice, driven by progress in internet connection speeds and the development of mobile devices.

With the emergence of the Covid-19 pandemic, face-to-face education almost came to a halt, and online education gained significant importance. Microsoft reported 750 users involved in its online education activities on March 10, just before the pandemic; by March 24, however, the number of users had increased dramatically to 138,698 (OECD, 2020). This supports the view that online education should be used widely, rather than merely as an alternative to traditional education, when students do not have the opportunity to receive face-to-face education (Geostat, 2019). The Covid-19 pandemic emerged suddenly and sharply limited opportunities, and face-to-face education stopped for a long time during this period. The global spread of Covid-19 affected more than 850 million students all around the world and caused the suspension of face-to-face education. Different countries proposed several solutions to maintain the education process during the pandemic. Schools had to change their curricula, and many countries supported online education practices soon after the pandemic began. In other words, traditional education gave way to online education practices. At least 96 countries mobilized access to online libraries, TV broadcasts, instructions, resources, video lectures, and online channels (UNESCO, 2020). In such a painful period, educational institutions moved to online education practices with the help of major companies such as Microsoft, Google, Zoom, Skype, FaceTime, and Slack. Thus, online education has been discussed on the education agenda more intensively than ever before.

Although online education approaches were not used as comprehensively as they are today, they have long been used as an alternative learning approach in parallel with the development of technology, the internet, and computers. Online education approaches are often employed with the aim of promoting students' academic achievement. Accordingly, academics in various countries have conducted many studies evaluating online education approaches and published the related results. However, the accumulation of scientific data on online education makes it difficult to keep track of, organize, and synthesize the findings. Studies in this area are being conducted at an increasing rate, making it difficult for scientists to be aware of all the research outside their own expertise. Another problem is that online education studies are repetitive: studies often use slightly different methods, measures, and/or samples to avoid duplication, which makes it difficult to distinguish significant differences in the results. In other words, if there are significant differences across studies, it may be difficult to say which factors explain them. One obvious solution to these problems is to systematically review the results of various studies and uncover the sources of variation. One method for performing such systematic syntheses is meta-analysis, a methodological and statistical approach for drawing conclusions from the literature. At this point, how effective online education applications are in increasing academic success is an important question. Has online education, which is likely to be encountered frequently in the continuing pandemic period, been successful in the last ten years? If so, how large was the impact? Did different variables affect this impact? What should we consider in upcoming online education practices? Academics across the globe have carried out studies evaluating online education platforms and have published the related results (Chiao et al., 2018), and it is important to evaluate the results of the studies published so far and those that will be published in the future. These questions motivated us to carry out this study. We conducted a comprehensive meta-analysis that aims to provide a discussion platform on how to develop efficient online programs for educators and policy makers by reviewing related studies on online education, presenting the effect size, and revealing the effect of diverse variables on the overall impact.

There have been many critical discussions and comprehensive studies on the differences between online and face-to-face learning; however, the focus of this paper is different in that it clarifies the magnitude of the effect of online education on the teaching and learning process and identifies which factors should be controlled to help increase the effect size. Indeed, the purpose here is to support conscious decisions in the implementation of the online education process.

The general impact of online education on academic achievement will be examined in this study. This provides an opportunity to get a general overview of online education, which has been practiced and discussed intensively during the pandemic period. Moreover, the general impact of online education on academic achievement will be analyzed with respect to different variables. In other words, the current study makes it possible to evaluate the study results from the related literature as a whole, and to analyze the results across several cultures, lectures, and class levels. Considering all these points, this study seeks to answer the following research questions:

What is the effect size of online education on academic achievement?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the country?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the class level?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the lecture?

How do the effect sizes of online education on academic achievement change according to the moderator variable of the online education approaches?

This study aims at determining the effect size of online education, which has been heavily used since the beginning of the pandemic, on students' academic achievement in different courses by using a meta-analysis method. Meta-analysis is a synthesis method that enables several study results to be gathered accurately and efficiently and combined into an overall result (Tsagris & Fragkos, 2018).

2.1 Selecting and coding the data (studies)

The literature required for the meta-analysis was reviewed in July 2020, and a follow-up review was conducted in September 2020. The purpose of the follow-up review was to include studies published while this study was being conducted that met the inclusion criteria; however, no such studies were found.

To access the studies for the meta-analysis, the Web of Science, ERIC, and SCOPUS databases were searched using the keywords 'online learning' and 'online education'. Not every database has a search engine that allows studies to be retrieved simply by entering keywords, and this obstacle had to be overcome. Therefore, a specially designed platform was used by the researcher: through the open access system of the Cukurova University Library, detailed searches were carried out using EBSCO Information Services (EBSCO), which allows the whole collection of research to be searched through a single search box. Since the fundamental variables of this study are online education and online learning, the literature in the stated databases (Web of Science, ERIC, and SCOPUS) was systematically reviewed using these keywords. Within this scope, 225 articles were retrieved, and the studies were entered into the coding key list created by the researcher. The names of the researchers, the year, the database (Web of Science, ERIC, and SCOPUS), the sample group and size, the lectures in which academic achievement was tested, the country in which the study was conducted, and the class levels were all recorded in this coding key.

The following criteria were identified for including the 225 research studies coded on the theoretical basis of the meta-analysis: (1) the studies should be published in refereed journals between the years 2010 and 2021, (2) the studies should be experimental studies that aim to determine the effect of online education and online learning on academic achievement, (3) the values of the stated variables, or the statistics required to calculate them, should be reported in the results, and (4) the sample group of the study should be at the primary education level. These criteria also served as exclusion criteria, in the sense that studies not meeting them were not included in the present study.

After the inclusion criteria were determined, a systematic review process was conducted through EBSCO, following the year criterion of the study. Within this scope, 290,365 studies analyzing the effect of online education and online learning on academic achievement were retrieved. The databases (Web of Science, ERIC, and SCOPUS) were then applied as a filter in line with the inclusion criteria, reducing the number of studies to 58,616. Afterwards, the keyword 'primary education' was used as a filter, and the number of studies decreased to 3152. Lastly, the literature was searched using the keyword 'academic achievement', and 225 studies were retrieved. All the information from these 225 articles was included in the coding key.

It is necessary for coders to review the related studies accurately and to check their validity and reliability (Stewart & Kamins, 2001). Within this scope, the studies identified on the basis of the variables used in this study were first reviewed by three researchers from the primary education field; the retrieved studies were then combined and processed in the coding key by the researcher. All the studies processed in the coding key were analyzed against the inclusion criteria by all the researchers in meetings, and it was decided that 27 studies met the inclusion criteria (Atici & Polat, 2010; Carreon, 2018; Ceylan & Elitok Kesici, 2017; Chae & Shin, 2016; Chiang et al., 2014; Ercan, 2014; Ercan et al., 2016; Gwo-Jen et al., 2018; Hayes & Stewart, 2016; Hwang et al., 2012; Kert et al., 2017; Lai & Chen, 2010; Lai et al., 2015; Meyers et al., 2015; Ravenel et al., 2014; Sung et al., 2016; Wang & Chen, 2013; Yu, 2019; Yu & Chen, 2014; Yu & Pan, 2014; Yu et al., 2010; Zhong et al., 2017). The data from the studies meeting the inclusion criteria were processed independently in a second coding key by three researchers, and consensus meetings were arranged for further discussion. After the meetings, the researchers agreed that the data had been coded accurately and precisely. Having identified the effect sizes and the heterogeneity of the studies, the researchers determined moderator variables that could explain the differences between the effect sizes. The data related to these moderator variables were added to the coding key by the three researchers, and a new consensus meeting was arranged. After the meeting, the researchers agreed that the moderator variables had been coded accurately and precisely.

2.2 Study group

27 studies are included in the meta-analysis. The total sample size of the studies that are included in the analysis is 1772. The characteristics of the studies included are given in Table 1 .

2.3 Publication bias

Publication bias refers to the limited ability of published studies on a research subject to represent all completed studies on that subject (Card, 2011; Littell et al., 2008). Similarly, publication bias describes a relationship between the probability that a study on a subject is published and the effect size and significance it produces. Publication bias may occur when researchers decide not to publish a study after failing to obtain the expected results, or when a study is not approved by scientific journals and consequently cannot be included in a research synthesis (Makowski et al., 2019). A high possibility of publication bias in a meta-analysis negatively affects the accuracy of the combined effect size (Pecoraro, 2018), causing the average effect size to be reported differently than it should be (Borenstein et al., 2009). For this reason, the possibility of publication bias in the included studies was tested before determining the effect sizes of the relationships between the stated variables. The possibility of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's fail-safe N analysis, Duval and Tweedie's trim-and-fill analysis, and Egger's regression test.
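
Egger's regression test can be illustrated with a short sketch. The following Python snippet is illustrative only (not the authors' code): it regresses the standardized effect on precision and tests whether the intercept differs from zero, using hypothetical per-study effect sizes and standard errors.

```python
# Minimal sketch of Egger's regression test for funnel-plot asymmetry.
# The study-level numbers below are hypothetical placeholders.
import numpy as np
from scipy import stats

g = np.array([0.35, 0.52, 0.18, 0.61, 0.44, 0.29])   # hypothetical Hedges' g values
se = np.array([0.21, 0.18, 0.25, 0.15, 0.20, 0.23])  # hypothetical standard errors

# Regress the standardized effect (g / se) on precision (1 / se);
# an intercept far from zero suggests funnel-plot asymmetry (possible publication bias).
z = g / se
precision = 1.0 / se
X = np.column_stack([np.ones_like(precision), precision])

beta, _, _, _ = np.linalg.lstsq(X, z, rcond=None)
resid = z - X @ beta
df = len(z) - 2
sigma2 = resid @ resid / df
cov = sigma2 * np.linalg.inv(X.T @ X)
t_intercept = beta[0] / np.sqrt(cov[0, 0])
p_value = 2 * stats.t.sf(abs(t_intercept), df)

print(f"Egger intercept = {beta[0]:.3f}, t = {t_intercept:.3f}, p = {p_value:.3f}")
# A p value above 0.05, as reported in this study, is consistent with no detectable asymmetry.
```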

2.4 Selecting the model

After assessing the probability of publication bias, the statistical model used to calculate the effect sizes was selected. The main approaches used in effect size calculations, according to the level of inter-study variance, are the fixed effects and random effects models (Pigott, 2012). The fixed effects model assumes that the combined studies are homogeneous in their characteristics apart from sample size, while the random effects model allows for parameter diversity between the studies (Cumming, 2012). When calculating the average effect size under the random effects model (Deeks et al., 2008), which assumes that the effect estimates of different studies come from a common distribution, it is necessary to consider several sources of variation beyond sampling error, such as the characteristics of the participants and the duration, scope, and design of the studies (Littell et al., 2008). When choosing a model for a meta-analysis, the assumptions about the sample characteristics of the included studies and the inferences the researcher aims to make should be taken into consideration. The fact that the sample characteristics of studies conducted in the social sciences are affected by various parameters suggests that the random effects model is more appropriate. Moreover, inferences made with the random effects model extend beyond the studies included in the meta-analysis (Field, 2003; Field & Gillett, 2010), which contributes to the generalizability of the research data. The stated criteria for model selection show that, according to the nature of the meta-analysis, the model should be selected before the analysis (Borenstein et al., 2007; Littell et al., 2008). Within this framework, the random effects model was chosen, considering that the students sampled in the included studies come from different countries and cultures, that the sample characteristics of the studies differ, and that the designs and scopes of the studies vary as well.
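
To make the distinction between the two models concrete, here is a minimal Python sketch (not the authors' code) that pools hypothetical Hedges' g values under a fixed-effect weighting and under a DerSimonian-Laird random-effects weighting; all study-level numbers are invented for illustration.

```python
# Minimal sketch contrasting fixed- and random-effects pooling of Hedges' g values,
# using the DerSimonian-Laird estimate of between-study variance (tau^2).
import numpy as np

g = np.array([0.35, 0.52, 0.18, 0.61, 0.44, 0.29])        # hypothetical per-study Hedges' g
v = np.array([0.044, 0.032, 0.062, 0.023, 0.040, 0.053])  # hypothetical within-study variances

# Fixed-effect model: weight each study only by its within-study precision.
w_fixed = 1.0 / v
g_fixed = np.sum(w_fixed * g) / np.sum(w_fixed)

# DerSimonian-Laird tau^2 from the Q statistic.
Q = np.sum(w_fixed * (g - g_fixed) ** 2)
df = len(g) - 1
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (Q - df) / c)

# Random-effects model: weights also incorporate between-study variance.
w_random = 1.0 / (v + tau2)
g_random = np.sum(w_random * g) / np.sum(w_random)
se_random = np.sqrt(1.0 / np.sum(w_random))
ci = (g_random - 1.96 * se_random, g_random + 1.96 * se_random)

print(f"fixed-effect g = {g_fixed:.3f}, random-effects g = {g_random:.3f}")
print(f"tau^2 = {tau2:.4f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```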

2.5 Heterogeneity

Meta-analysis facilitates analyzing the research subject with different parameters by showing the level of diversity between the included studies. Within this frame, whether there is a heterogeneous distribution between the included studies has been evaluated in the present study. The heterogeneity of the combined studies has been determined through the Q and I² tests. The Q test evaluates the probability that the differences between the observed results arise from random variation alone (Deeks et al., 2008). A Q value exceeding the χ² critical value, calculated according to the degrees of freedom and the significance level, indicates heterogeneity of the combined effect sizes (Card, 2011). The I² statistic, which complements the Q test, shows the amount of heterogeneity in the effect sizes (Cleophas & Zwinderman, 2017). An I² value higher than 75% is interpreted as a high level of heterogeneity.
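
A minimal sketch of how Q and I² can be computed from study-level effect sizes and variances is shown below; the input values are hypothetical, and the snippet is only meant to illustrate the statistics described in this section.

```python
# Minimal sketch of the Q and I^2 heterogeneity statistics, with hypothetical inputs.
import numpy as np
from scipy import stats

g = np.array([0.35, 0.52, 0.18, 0.61, 0.44, 0.29])        # hypothetical Hedges' g values
v = np.array([0.044, 0.032, 0.062, 0.023, 0.040, 0.053])  # hypothetical variances

w = 1.0 / v
g_pooled = np.sum(w * g) / np.sum(w)

Q = np.sum(w * (g - g_pooled) ** 2)          # Cochran's Q
df = len(g) - 1
chi2_crit = stats.chi2.ppf(0.95, df)         # chi-square critical value at the 5% level
p_Q = stats.chi2.sf(Q, df)
I2 = max(0.0, (Q - df) / Q) * 100            # share of variability due to heterogeneity

print(f"Q = {Q:.3f} (critical value {chi2_crit:.3f}, p = {p_Q:.3f}), I^2 = {I2:.1f}%")
# An I^2 above roughly 75% would indicate high heterogeneity; low values favor homogeneity.
```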

If heterogeneity is encountered among the studies included in the meta-analysis, its sources can be analyzed by referring to the study characteristics. The study characteristics that may be related to heterogeneity between the included studies can be examined through subgroup analysis or meta-regression analysis (Deeks et al., 2008). When determining the moderator variables, the sufficiency of the number of variables, the relationships between the moderators, and their ability to explain the differences between the study results were all considered. Within this scope, it was predicted that heterogeneity in the effect of online education, which has been used heavily since the beginning of the pandemic, on students' academic achievement in different lectures could be explained by the country, class level, and lecture moderator variables. Some subgroups were evaluated and categorized together because the number of effect sizes in the sub-dimensions of the specified variables was not sufficient to perform moderator analysis (e.g., the countries where the studies were conducted).

2.6 Interpreting the effect sizes

The effect size is a measure of how much the independent variable affects the dependent variable, positively or negatively, in each study included in the meta-analysis (Dinçer, 2014). When interpreting the effect sizes obtained from the meta-analysis, the classifications of Cohen et al. (2007) have been used. Whether the specified relationships differ according to the country, class level, and school subject variables has been assessed through the Q test, the degrees of freedom, and the p significance value (see Figs. 1 and 2).
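
For a single primary study, Hedges' g is the standardized mean difference with a small-sample correction. The sketch below is illustrative only: the group statistics are hypothetical, and the interpretation bands use the widely cited 0.2/0.5/0.8 conventions, which may differ slightly from the Cohen et al. (2007) classification applied in this study.

```python
# Minimal sketch of computing Hedges' g from group summary statistics.
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference with the small-sample (Hedges) correction."""
    s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pooled
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)      # small-sample correction factor J
    return j * d

def label(g):
    # Conventional bands (0.2 / 0.5 / 0.8); the study's own classification may differ.
    g = abs(g)
    if g < 0.2:
        return "negligible"
    if g < 0.5:
        return "small"
    if g < 0.8:
        return "medium"
    return "large"

# Hypothetical online vs. face-to-face achievement scores for one study.
g = hedges_g(m1=74.2, sd1=11.5, n1=32, m2=69.8, sd2=12.1, n2=30)
print(f"Hedges' g = {g:.3f} ({label(g)})")
```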

3 Findings and results

The purpose of this study is to determine the effect size of online education on academic achievement. Before determining the effect sizes, the probability of publication bias in this meta-analysis was analyzed using the funnel plot, Orwin's fail-safe N analysis, Duval and Tweedie's trim-and-fill analysis, and Egger's regression test.

When the funnel plots are examined, it is seen that the studies included in the analysis are distributed symmetrically on both sides of the combined effect size axis and are generally concentrated in the middle and lower sections. According to the plots, the probability of publication bias is low. However, since funnel plots are open to subjective interpretation, they have been supported by additional analyses (Littell et al., 2008). Therefore, to provide further evidence on the probability of publication bias, it has been analyzed through Orwin's fail-safe N analysis, Duval and Tweedie's trim-and-fill analysis, and Egger's regression test (Table 2).

Table 2 presents the publication bias analyses carried out before computing the effect size of online education on academic achievement. According to the table, the Orwin's fail-safe N results show that it is not necessary to add new studies to the meta-analysis for Hedges' g to reach a value outside the range of ±0.01. The Duval and Tweedie trim-and-fill test shows that excluding the studies that negatively affect the symmetry of the funnel plot, or adding their exact symmetrical counterparts, does not significantly change the calculated effect size. The non-significance of the Egger's test results indicates that there is no publication bias in the meta-analysis. Together, these analyses indicate high internal validity of the effect sizes and adequate representation of the studies conducted on the relevant subject.
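
Orwin's fail-safe N can be expressed in a single formula. The sketch below is an illustration, not the authors' calculation: it uses the pooled effect reported later in the paper (g = 0.409), the 27 included studies, and a trivial criterion of 0.01 corresponding to the ±0.01 range mentioned above, under the common assumption that missing studies average a null effect.

```python
# Minimal sketch of Orwin's fail-safe N: how many unpublished null-result studies
# would be needed to pull the pooled effect down to a trivial criterion value.
def orwin_fail_safe_n(k, observed_effect, criterion_effect, missing_mean=0.0):
    """Number of additional studies (averaging `missing_mean`) needed to reduce
    the combined effect from `observed_effect` to `criterion_effect`."""
    return k * (observed_effect - criterion_effect) / (criterion_effect - missing_mean)

n_fs = orwin_fail_safe_n(k=27, observed_effect=0.409, criterion_effect=0.01)
print(f"Orwin's fail-safe N ≈ {n_fs:.0f} studies")  # a large value suggests a robust pooled effect
```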

In this study, it was aimed to determine the effect size of online education on academic achievement after testing the publication bias. In line with the first purpose of the study, the forest graph regarding the effect size of online education on academic achievement is shown in Fig.  3 , and the statistics regarding the effect size are given in Table 3 .

Figure 1. The flow chart of the scanning and selection process of the studies

Figure 2. Funnel plot representing the effect sizes of the effects of online education on academic success

Figure 3. Forest graph related to the effect size of online education on academic success

The square symbols in the forest graph in Fig. 3 represent the effect sizes, the horizontal lines show the 95% confidence intervals of the effect sizes, and the diamond symbol shows the overall effect size. When the forest graph is analyzed, it is seen that the lower and upper limits of the combined effect sizes are generally close to each other and that the study weights are similar. This similarity in study weights indicates that the combined studies contribute similarly to the overall effect size.

Figure 3 shows that the study of Liu et al. (2018) has the lowest effect size and the study of Ercan and Bilen (2014) has the highest. The forest graph shows that all the combined studies and the overall effect are positive. Furthermore, the forest graph in Fig. 3 and the effect size statistics in Table 3 show that, across the 27 studies analyzing the effect of online education on academic achievement, the combined effect is of medium level (g = 0.409).
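
A forest plot of this kind can be reproduced with a few lines of matplotlib. The sketch below is illustrative only: the study names and per-study numbers are hypothetical placeholders, and only the pooled value (g = 0.409) is taken from the text, with an assumed standard error.

```python
# Minimal matplotlib sketch of a forest plot: squares for per-study effects,
# horizontal 95% CI lines, and a diamond marker for the pooled effect.
import numpy as np
import matplotlib.pyplot as plt

studies = ["Study A", "Study B", "Study C", "Study D", "Study E"]
g = np.array([0.21, 0.48, 0.35, 0.62, 0.40])     # hypothetical Hedges' g values
se = np.array([0.18, 0.15, 0.22, 0.17, 0.20])    # hypothetical standard errors
pooled_g, pooled_se = 0.409, 0.05                # pooled g as reported; SE assumed

y = np.arange(len(studies), 0, -1)
fig, ax = plt.subplots(figsize=(6, 3))
ax.errorbar(g, y, xerr=1.96 * se, fmt="s", color="black", capsize=3)            # squares + CIs
ax.errorbar([pooled_g], [0], xerr=[1.96 * pooled_se], fmt="D", color="blue")    # overall diamond
ax.axvline(0, linestyle="--", color="grey")                                     # line of no effect
ax.set_yticks(list(y) + [0])
ax.set_yticklabels(studies + ["Overall (random effects)"])
ax.set_xlabel("Hedges' g")
plt.tight_layout()
plt.show()
```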

After the analysis of the effect size, whether the studies included in the analysis are distributed heterogeneously was also examined. The heterogeneity of the combined studies was determined through the Q and I² tests. As a result of the heterogeneity test, the Q statistic was calculated as 29.579. With 26 degrees of freedom at the 95% significance level, the critical value in the chi-square table is 38.885, and the Q statistic calculated in this study (29.579) is lower than this critical value. The I² value, which complements the Q statistic, is 12.100%, indicating that the true heterogeneity, or the total variability attributable to variation between the studies, is about 12%. Besides, the p value (0.285) is higher than 0.05. All these values [Q(26) = 29.579, p = 0.285; I² = 12.100%] indicate that there is a homogeneous distribution between the effect sizes, and that the fixed effects model could be used to interpret them. However, some researchers argue that even if heterogeneity is low, results should be evaluated with the random effects model (Borenstein et al., 2007); therefore, this study reports information about both models. The heterogeneity of the combined studies was then examined in relation to the characteristics of the included studies. In this context, the final purpose of the study is to determine the effect of the country, class level, and lecture variables on the findings. Accordingly, the statistics comparing the stated relations according to the countries where the studies were conducted are given in Table 4.
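
These figures can be checked directly from the reported Q, degrees of freedom, and significance level; the short snippet below reproduces the chi-square critical value, I², and p value.

```python
# Quick check of the heterogeneity figures reported above, using only the reported Q and df.
from scipy import stats

Q, df = 29.579, 26
chi2_crit = stats.chi2.ppf(0.95, df)           # ≈ 38.885, as stated in the text
I2 = (Q - df) / Q * 100                        # ≈ 12.1%, matching the reported 12.100%
p = stats.chi2.sf(Q, df)                       # ≈ 0.285, matching the reported p value

print(f"critical value = {chi2_crit:.3f}, I^2 = {I2:.1f}%, p = {p:.3f}")
```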

As seen in Table 4, the effect of online education on academic achievement does not differ significantly according to the countries where the studies were conducted. The Q test results concern whether the relationships between the variables differ across these countries. According to the table, the effect of online education on academic achievement was highest in the 'other countries' group and lowest in the USA. The statistics comparing the stated relations according to class level are given in Table 5.

As seen in Table 5, the effect of online education on academic achievement does not differ according to class level. However, the effect of online education on academic achievement is highest in the 4th grade. The statistics comparing the stated relations according to school subject are given in Table 6.

As seen in Table 6, the effect of online education on academic achievement does not differ according to the school subjects covered in the studies. However, the effect of online education on academic achievement is highest in the ICT subject.

The obtained effect size was formed from the findings of primary studies conducted in 7 different countries. In addition, these studies cover different approaches to online education (online learning environments, social networks, blended learning, etc.). In this respect, the results may raise some questions about the validity and generalizability of the study's findings. However, the moderator analyses, whether for the country variable or for the approaches covered by online education, did not reveal significant differences in effect sizes. If significant differences had emerged, comparisons made across countries under the umbrella of online education would have been open to doubt in terms of generalizability. Moreover, no study was found in the literature that was conducted under the name of online education alone, without relying on a specific approach or technique. For instance, one commonly used concept is blended education, defined as an educational model in which online education is combined with the traditional education method (Colis & Moonen, 2001). Similarly, Rasmussen (2003) defines blended learning as "a distance education method that combines technology (high technology such as television and the internet, or low technology such as voice e-mail and conferences) with traditional education and training." Further, Kerres and Witt (2003) define blended learning as "combining face-to-face learning with technology-assisted learning." As is clearly observed, online education, which has a wider scope, includes many approaches.

As seen in Table 7, the effect of online education on academic achievement does not differ according to the online education approaches included in the studies. However, the effect of online education on academic achievement is highest for the web-based problem solving approach.

4 Conclusions and discussion

Considering the developments during the pandemic, it is thought that the diversity of online education applications as an interdisciplinary, pragmatist field will increase, and that learning content and processes will be enriched by the integration of new technologies into online education. Another prediction is that more flexible and accessible learning opportunities will be created in online education processes and that, in this way, lifelong learning will be strengthened. As a result, it is predicted that in the near future online education, or digital learning under a newer name, will become the main ground of education instead of being an alternative to, or a support for, face-to-face learning. The lessons learned from the early period of online learning, adopted rapidly because of the Covid-19 epidemic, will serve to develop this method all over the world; in the near future, online learning will become the main learning structure by increasing its functionality with the contribution of new technologies and systems. From this point of view, there is a necessity to strengthen online education.

In this study, the effect of online learning on academic achievement is at a moderate level. To increase this effect, the implementation of online learning requires support from teachers to prepare learning materials, to design learning appropriately, and to utilize various digital media such as websites, software technology, and other tools to support the effectiveness of online learning (Rolisca & Achadiyah, 2014). According to research conducted by Rahayu et al. (2017), the use of various types of software increases the effectiveness and quality of online learning. The implementation of online learning can also affect students' ability to adapt to technological developments, in that it leads students to use various learning resources on the internet to access different types of information and helps them get used to inquiry learning and active learning (Hart et al., 2019; Prestiadi et al., 2019). In addition, there may be many reasons why the effect found in this study is not higher. The moderator variables examined here could guide efforts to increase the practical effect; however, the effect size did not differ significantly for any of the moderator variables. Different moderator analyses could be evaluated in order to increase the level of impact of online education on academic success. If moderator variables that significantly change the effect level are detected, more precise statements can be made about how to increase it. In addition to technical and financial problems, the level of impact will increase if a few other difficulties are addressed, such as students' lack of interaction with the instructor, slow response times, and the lack of traditional classroom socialization.

In addition, the social distancing associated with the COVID-19 pandemic has posed extreme difficulties for all stakeholders in getting online, as they have had to work under time and resource constraints. Adopting the online learning environment is not just a technical issue; it is a pedagogical and instructional challenge as well. Therefore, extensive preparation of teaching materials, curriculum, and assessment is vital in online education. Technology is the delivery tool, and it requires close cross-collaboration between teaching, content, and technology teams (CoSN, 2020).

Online education applications have been used for many years, but they have come to the fore during the pandemic. This necessity has brought with it a discussion of using online education instead of traditional education methods in the future. However, this research has revealed that online education applications are only moderately effective. Using online education instead of face-to-face education will only be possible with an increase in its level of success, which may come from the experience and knowledge gained during the pandemic. Therefore, meta-analyses of experimental studies conducted in the coming years will guide us. In this context, experimental studies using online education applications should be analyzed carefully, and it would be useful to identify variables that can change the level of impact through different moderators. Moderator analyses are valuable in meta-analysis studies (consider, for example, the role of moderators in Karl Pearson's typhoid vaccine studies), and each such analysis sheds light on future studies. In meta-analyses on online education, it would be beneficial to go beyond the moderators examined in this study; in this way, the contribution of similar studies to the field will increase.

The purpose of this study is to determine the effect of online education on academic achievement. In line with this purpose, studies that analyze the effect of online education approaches on academic achievement were included in the meta-analysis. The total sample size of the included studies is 1772. While the included studies were conducted in the USA, Taiwan, Turkey, China, the Philippines, Ireland, and Georgia, no studies carried out in Europe could be found. One possible reason is the greater use of quantitative research methods from a positivist perspective in countries with an American academic tradition. As a result of the study, the effect size of online education on academic achievement (g = 0.409) was found to be moderate. In the studies included in the present research, online education approaches were more effective than traditional ones. However, contrary to the present study, some comparisons between online and traditional education show that face-to-face learning is still considered effective compared to online learning (Ahmad et al., 2016; Hamdani & Priatna, 2020; Wei & Chou, 2020).

Online education has advantages and disadvantages. An advantage of online learning over face-to-face classroom learning is the flexibility of learning time: learning is not tied to a single schedule and can be shaped according to circumstances (Lai et al., 2019). Another advantage is the ease of submitting assignments, as this can be done without having to talk to the teacher. Despite this, online education has several weaknesses, such as students having difficulty understanding the material, teachers being unable to monitor students, and students still having difficulty interacting with teachers when the internet connection is cut (Swan, 2007). According to Astuti et al. (2019), the face-to-face education method is still considered better by students than e-learning because it is easier to understand the material and to interact with teachers.

The results of the study illustrated that the effect size of online education on academic achievement (g = 0.409) is of medium level, and the moderator analyses showed that this effect does not differ in terms of the country, lecture, class level, or online education approach variables. A review of the literature shows that several meta-analyses on online education have been published (Bernard et al., 2004; Machtmes & Asher, 2000; Zhao et al., 2005). Typically, these meta-analyses also include studies of older-generation technologies such as audio, video, or satellite transmission. One of the most comprehensive studies on online education was conducted by Bernard et al. (2004). In that study, 699 independent effect sizes from 232 studies published from 1985 to 2001 were analyzed, and face-to-face education was compared with online education with respect to achievement criteria and the attitudes of various learners, from young children to adults. In that meta-analysis, an overall effect size close to zero was found for students' achievement (g+ = 0.01).

In another meta-analysis, carried out by Zhao et al. (2005), 98 effect sizes were examined from 51 studies on online education conducted between 1996 and 2002. Compared with the study of Bernard et al. (2004), this meta-analysis focuses more on the activities carried out in online education courses. As a result, an overall effect size close to zero was again found for online education utilizing more than one generation of technology for students at different levels. However, a salient point of the Zhao et al. meta-analysis is that it averages the different types of outcomes used within a study to calculate an overall effect size. This practice is problematic because the factors that improve one type of learner outcome (e.g., learner rehabilitation), particularly course characteristics and practices, may be quite different from those that improve another type of outcome (e.g., learner achievement), and may even harm the latter. By mixing studies with different types of outcomes, this approach may obscure the relationship between practices and learning.

Some meta-analytic studies have focused on the effectiveness of new-generation distance learning courses delivered over the internet for specific student populations. For instance, Sitzmann et al. (2006) reviewed 96 studies published from 1996 to 2005 that compared web-based instruction in job-related knowledge or skills with face-to-face instruction. The researchers found that web-based instruction was in general slightly more effective than face-to-face instruction, but insufficient in terms of applicability ("knowing how to apply"). In addition, Sitzmann et al. (2006) found that internet-based instruction had a positive effect on theoretical knowledge in quasi-experimental studies, whereas face-to-face instruction was favored in experimental studies with random assignment. This moderator analysis emphasizes the need to pay attention to the designs of the studies included in a meta-analysis. The designs of the studies included in the present meta-analysis were not considered; this can be offered as a suggestion for future studies.

Another meta-analysis was conducted by Cavanaugh et al. (2004), focusing on online education. In this study of internet-based distance education programs for students under 12 years of age, the researchers combined 116 results from 14 studies published between 1999 and 2004 to calculate an overall effect that was not statistically different from zero. The moderator analysis carried out in that study showed that there was no significant factor affecting students' success. That meta-analysis used multiple results from the same study, ignoring the fact that different results from the same students are not independent of each other.

In conclusion, some meta-analytic studies have analyzed the consequences of online education for a wide range of students (Bernard et al., 2004; Zhao et al., 2005), and the effect sizes in these studies were generally low. Furthermore, none of the large-scale meta-analyses considered moderators, database quality standards, or class levels in the selection of studies, while some of them referred only to country and lecture moderators. Advances in internet-based learning tools, the pandemic, and the increasing popularity of online learning in different contexts have made a precise meta-analysis of students' learning outcomes through online learning necessary. Previous meta-analyses were typically based on studies involving a narrow range of confounding variables. In the present study, common but significant moderators, such as class level and lecture during the pandemic, were examined. For instance, problems have been experienced especially with the suitability of online education platforms for different class levels during the pandemic. There is a need to study, and make suggestions on, whether online education can meet the needs of teachers and students.

Besides, the main forms of online education in the past were watching the open lectures of famous universities and the educational videos of institutions. During the pandemic, by contrast, online education has mainly been classroom-based teaching implemented by teachers in their own schools, as an extension of the original school education. This meta-analysis will stand as a source for comparing the effect size of the online education forms of the past decade with what is done today and what will be done in the future.

Lastly, the heterogeneity test results of the meta-analysis study display that the effect size does not differ in terms of class level, country, online education approaches, and lecture moderators.

*Studies included in meta-analysis

Ahmad, S., Sumardi, K., & Purnawan, P. (2016). Komparasi Peningkatan Hasil Belajar Antara Pembelajaran Menggunakan Sistem Pembelajaran Online Terpadu Dengan Pembelajaran Klasikal Pada Mata Kuliah Pneumatik Dan Hidrolik. Journal of Mechanical Engineering Education, 2 (2), 286–292.


Ally, M. (2004). Foundations of educational theory for online learning. Theory and Practice of Online Learning, 2 , 15–44. Retrieved on the 11th of September, 2020 from https://eddl.tru.ca/wp-content/uploads/2018/12/01_Anderson_2008-Theory_and_Practice_of_Online_Learning.pdf

Arat, T., & Bakan, Ö. (2011). Uzaktan eğitim ve uygulamaları. Selçuk Üniversitesi Sosyal Bilimler Meslek Yüksek Okulu Dergisi , 14 (1–2), 363–374. https://doi.org/10.29249/selcuksbmyd.540741

Astuti, C. C., Sari, H. M. K., & Azizah, N. L. (2019). Perbandingan Efektifitas Proses Pembelajaran Menggunakan Metode E-Learning dan Konvensional. Proceedings of the ICECRS, 2 (1), 35–40.

*Atici, B., & Polat, O. C. (2010). Influence of the online learning environments and tools on the student achievement and opinions. Educational Research and Reviews, 5 (8), 455–464. Retrieved on the 11th of October, 2020 from https://academicjournals.org/journal/ERR/article-full-text-pdf/4C8DD044180.pdf

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education compare with classroom instruction? A meta- analysis of the empirical literature. Review of Educational Research, 3 (74), 379–439. https://doi.org/10.3102/00346543074003379

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis . Wiley.


Borenstein, M., Hedges, L., & Rothstein, H. (2007). Meta-analysis: Fixed effect vs. random effects . UK: Wiley.

Card, N. A. (2011). Applied meta-analysis for social science research: Methodology in the social sciences . Guilford.


*Carreon, J. R. (2018). Facebook as integrated blended learning tool in technology and livelihood education exploratory. Retrieved on the 1st of October, 2020 from https://files.eric.ed.gov/fulltext/EJ1197714.pdf

Cavanaugh, C., Gillan, K. J., Kromrey, J., Hess, M., & Blomeyer, R. (2004). The effects of distance education on K-12 student outcomes: A meta-analysis. Learning Point Associates/North Central Regional Educational Laboratory (NCREL) . Retrieved on the 11th of September, 2020 from https://files.eric.ed.gov/fulltext/ED489533.pdf

*Ceylan, V. K., & Elitok Kesici, A. (2017). Effect of blended learning to academic achievement. Journal of Human Sciences, 14 (1), 308. https://doi.org/10.14687/jhs.v14i1.4141

*Chae, S. E., & Shin, J. H. (2016). Tutoring styles that encourage learner satisfaction, academic engagement, and achievement in an online environment. Interactive Learning Environments, 24(6), 1371–1385. https://doi.org/10.1080/10494820.2015.1009472

*Chiang, T. H. C., Yang, S. J. H., & Hwang, G. J. (2014). An augmented reality-based mobile learning system to improve students’ learning achievements and motivations in natural science inquiry activities. Educational Technology and Society, 17 (4), 352–365. Retrieved on the 11th of September, 2020 from https://www.researchgate.net/profile/Gwo_Jen_Hwang/publication/287529242_An_Augmented_Reality-based_Mobile_Learning_System_to_Improve_Students'_Learning_Achievements_and_Motivations_in_Natural_Science_Inquiry_Activities/links/57198c4808ae30c3f9f2c4ac.pdf

Chiao, H. M., Chen, Y. L., & Huang, W. H. (2018). Examining the usability of an online virtual tour-guiding platform for cultural tourism education. Journal of Hospitality, Leisure, Sport & Tourism Education, 23 (29–38), 1. https://doi.org/10.1016/j.jhlste.2018.05.002

Chizmar, J. F., & Walbert, M. S. (1999). Web-based learning environments guided by principles of good teaching practice. Journal of Economic Education, 30 (3), 248–264. https://doi.org/10.2307/1183061

Cleophas, T. J., & Zwinderman, A. H. (2017). Modern meta-analysis: Review and update of methodologies . Switzerland: Springer. https://doi.org/10.1007/978-3-319-55895-0

Cohen, L., Manion, L., & Morrison, K. (2007). Observation. Research Methods in Education, 6, 396–412.

Colis, B., & Moonen, J. (2001). Flexible Learning in a Digital World: Experiences and Expectations. Open & Distance Learning Series . Stylus Publishing.

CoSN. (2020). COVID-19 Response: Preparing to Take School Online. Retrieved on the 3rd of September, 2021 from https://www.cosn.org/sites/default/files/COVID-19%20Member%20Exclusive_0.pdf

Cumming, G. (2012). Understanding new statistics: Effect sizes, confidence intervals, and meta-analysis. New York, USA: Routledge. https://doi.org/10.4324/9780203807002

Deeks, J. J., Higgins, J. P. T., & Altman, D. G. (2008). Analysing data and undertaking meta-analyses . In J. P. T. Higgins & S. Green (Eds.), Cochrane handbook for systematic reviews of interventions (pp. 243–296). Sussex: John Wiley & Sons. https://doi.org/10.1002/9780470712184.ch9

Demiralay, R., Bayır, E. A., & Gelibolu, M. F. (2016). Öğrencilerin bireysel yenilikçilik özellikleri ile çevrimiçi öğrenmeye hazır bulunuşlukları ilişkisinin incelenmesi. Eğitim ve Öğretim Araştırmaları Dergisi, 5 (1), 161–168. https://doi.org/10.23891/efdyyu.2017.10

Dinçer, S. (2014). Eğitim bilimlerinde uygulamalı meta-analiz. Pegem Atıf İndeksi, 2014(1), 1–133. https://doi.org/10.14527/pegem.001

*Durak, G., Cankaya, S., Yunkul, E., & Ozturk, G. (2017). The effects of a social learning network on students’ performances and attitudes. European Journal of Education Studies, 3 (3), 312–333. 10.5281/zenodo.292951

*Ercan, O. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes . European Journal of Educational Research, 3 (1), 9–23. https://doi.org/10.12973/eu-jer.3.1.9

Ercan, O., & Bilen, K. (2014). Effect of web assisted education supported by six thinking hats on students’ academic achievement in science and technology classes. European Journal of Educational Research, 3 (1), 9–23.

*Ercan, O., Bilen, K., & Ural, E. (2016). “Earth, sun and moon”: Computer assisted instruction in secondary school science - Achievement and attitudes. Issues in Educational Research, 26 (2), 206–224. https://doi.org/10.12973/eu-jer.3.1.9

Field, A. P. (2003). The problems in using fixed-effects models of meta-analysis on real-world data. Understanding Statistics, 2 (2), 105–124. https://doi.org/10.1207/s15328031us0202_02

Field, A. P., & Gillett, R. (2010). How to do a meta-analysis. British Journal of Mathematical and Statistical Psychology, 63 (3), 665–694. https://doi.org/10.1348/00071010x502733

Geostat. (2019). ‘Share of households with internet access’, National statistics office of Georgia . Retrieved on the 2nd September 2020 from https://www.geostat.ge/en/modules/categories/106/information-and-communication-technologies-usage-in-households

*Gwo-Jen, H., Nien-Ting, T., & Xiao-Ming, W. (2018). Creating interactive e-books through learning by design: The impacts of guided peer-feedback on students’ learning achievements and project outcomes in science courses. Journal of Educational Technology & Society., 21 (1), 25–36. Retrieved on the 2nd of October, 2020 https://ae-uploads.uoregon.edu/ISTE/ISTE2019/PROGRAM_SESSION_MODEL/HANDOUTS/112172923/CreatingInteractiveeBooksthroughLearningbyDesignArticle2018.pdf

Hamdani, A. R., & Priatna, A. (2020). Efektifitas implementasi pembelajaran daring (full online) dimasa pandemi Covid-19 pada jenjang Sekolah Dasar di Kabupaten Subang. Didaktik: Jurnal Ilmiah PGSD STKIP Subang, 6 (1), 1–9.

Hart, C. M., Berger, D., Jacob, B., Loeb, S., & Hill, M. (2019). Online learning, offline outcomes: Online course taking and high school student performance. Aera Open, 5(1).

*Hayes, J., & Stewart, I. (2016). Comparing the effects of derived relational training and computer coding on intellectual potential in school-age children. The British Journal of Educational Psychology, 86 (3), 397–411. https://doi.org/10.1111/bjep.12114

Horton, W. K. (2000). Designing web-based training: How to teach anyone anything anywhere anytime (Vol. 1). Wiley Publishing.

*Hwang, G. J., Wu, P. H., & Chen, C. C. (2012). An online game approach for improving students’ learning performance in web-based problem-solving activities. Computers and Education, 59 (4), 1246–1256. https://doi.org/10.1016/j.compedu.2012.05.009

*Kert, S. B., Köşkeroğlu Büyükimdat, M., Uzun, A., & Çayiroğlu, B. (2017). Comparing active game-playing scores and academic performances of elementary school students. Education 3–13, 45 (5), 532–542. https://doi.org/10.1080/03004279.2016.1140800

*Lai, A. F., & Chen, D. J. (2010). Web-based two-tier diagnostic test and remedial learning experiment. International Journal of Distance Education Technologies, 8 (1), 31–53. https://doi.org/10.4018/jdet.2010010103

*Lai, A. F., Lai, H. Y., Chuang W. H., & Wu, Z.H. (2015). Developing a mobile learning management system for outdoors nature science activities based on 5e learning cycle. Proceedings of the International Conference on e-Learning, ICEL. Proceedings of the International Association for Development of the Information Society (IADIS) International Conference on e-Learning (Las Palmas de Gran Canaria, Spain, July 21–24, 2015). Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/ED562095.pdf

Lai, C. H., Lin, H. W., Lin, R. M., & Tho, P. D. (2019). Effect of peer interaction among online learning community on learning engagement and achievement. International Journal of Distance Education Technologies (IJDET), 17 (1), 66–77.

Littell, J. H., Corcoran, J., & Pillai, V. (2008). Systematic reviews and meta-analysis . Oxford University.

*Liu, K. P., Tai, S. J. D., & Liu, C. C. (2018). Enhancing language learning through creation: the effect of digital storytelling on student learning motivation and performance in a school English course. Educational Technology Research and Development, 66 (4), 913–935. https://doi.org/10.1007/s11423-018-9592-z

Machtmes, K., & Asher, J. W. (2000). A meta-analysis of the effectiveness of telecourses in distance education. American Journal of Distance Education, 14 (1), 27–46. https://doi.org/10.1080/08923640009527043

Makowski, D., Piraux, F., & Brun, F. (2019). From experimental network to meta-analysis: Methods and applications with R for agronomic and environmental sciences. Dordrecht: Springer. https://doi.org/10.1007/978-94-024_1696-1

* Meyers, C., Molefe, A., & Brandt, C. (2015). The Impact of the" Enhancing Missouri's Instructional Networked Teaching Strategies"(eMINTS) Program on Student Achievement, 21st-Century Skills, and Academic Engagement--Second-Year Results . Society for Research on Educational Effectiveness. Retrieved on the 14 th November, 2020 from https://files.eric.ed.gov/fulltext/ED562508.pdf

OECD. (2020). ‘A framework to guide an education response to the COVID-19 Pandemic of 2020 ’. https://doi.org/10.26524/royal.37.6

Pecoraro, V. (2018). Appraising evidence . In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 99–114). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_9

Pigott, T. (2012). Advances in meta-analysis . Springer.

Pillay, H. , Irving, K., & Tones, M. (2007). Validation of the diagnostic tool for assessing Tertiary students’ readiness for online learning. Higher Education Research & Development, 26 (2), 217–234. https://doi.org/10.1080/07294360701310821

Prestiadi, D., Zulkarnain, W., & Sumarsono, R. B. (2019). Visionary leadership in total quality management: efforts to improve the quality of education in the industrial revolution 4.0. In the 4th International Conference on Education and Management (COEMA 2019). Atlantis Press

Poole, D. M. (2000). Student participation in a discussion-oriented online course: a case study. Journal of Research on Computing in Education, 33 (2), 162–177. https://doi.org/10.1080/08886504.2000.10782307

Rahayu, F. S., Budiyanto, D., & Palyama, D. (2017). Analisis penerimaan e-learning menggunakan technology acceptance model (Tam)(Studi Kasus: Universitas Atma Jaya Yogyakarta). Jurnal Terapan Teknologi Informasi, 1 (2), 87–98.

Rasmussen, R. C. (2003). The quantity and quality of human interaction in a synchronous blended learning environment . Brigham Young University Press.

*Ravenel, J., T. Lambeth, D., & Spires, B. (2014). Effects of computer-based programs on mathematical achievement scores for fourth-grade students. i-manager’s Journal on School Educational Technology, 10 (1), 8–21. https://doi.org/10.26634/jsch.10.1.2830

Rolisca, R. U. C., & Achadiyah, B. N. (2014). Pengembangan media evaluasi pembelajaran dalam bentuk online berbasis e-learning menggunakan software wondershare quiz creator dalam mata pelajaran akuntansi SMA Brawijaya Smart School (BSS). Jurnal Pendidikan Akuntansi Indonesia, 12(2).

Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effective- ness of Web-based and classroom instruction: A meta-analysis . Personnel Psychology, 59 (3), 623–664. https://doi.org/10.1111/j.1744-6570.2006.00049.x

Stewart, D. W., & Kamins, M. A. (2001). Developing a coding scheme and coding study reports. In M. W. Lipsey & D. B. Wilson (Eds.), Practical meta­analysis: Applied social research methods series (Vol. 49, pp. 73–90). Sage.

Swan, K. (2007). Research on online learning. Journal of Asynchronous Learning Networks, 11 (1), 55–59.

*Sung, H. Y., Hwang, G. J., & Chang, Y. C. (2016). Development of a mobile learning system based on a collaborative problem-posing strategy. Interactive Learning Environments, 24 (3), 456–471. https://doi.org/10.1080/10494820.2013.867889

Tsagris, M., & Fragkos, K. C. (2018). Meta-analyses of clinical trials versus diagnostic test accuracy studies. In G. Biondi-Zoccai (Ed.), Diagnostic meta-analysis: A useful tool for clinical decision-making (pp. 31–42). Cham, Switzerland: Springer. https://doi.org/10.1007/978-3-319-78966-8_4

UNESCO. (2020, Match 13). COVID-19 educational disruption and response. Retrieved on the 14 th November 2020 from https://en.unesco.org/themes/education-emergencies/ coronavirus-school-closures

Usta, E. (2011a). The effect of web-based learning environments on attitudes of students regarding computer and internet. Procedia-Social and Behavioral Sciences, 28 (262–269), 1. https://doi.org/10.1016/j.sbspro.2011.11.051

Usta, E. (2011b). The examination of online self-regulated learning skills in web-based learning environments in terms of different variables. Turkish Online Journal of Educational Technology-TOJET, 10 (3), 278–286. Retrieved on the 14th November 2020 from https://files.eric.ed.gov/fulltext/EJ944994.pdf

Vrasidas, C. & MsIsaac, M. S. (2000). Principles of pedagogy and evaluation for web-based learning. Educational Media International, 37 (2), 105–111. https://doi.org/10.1080/095239800410405

*Wang, C. H., & Chen, C. P. (2013). Effects of facebook tutoring on learning english as a second language. Proceedings of the International Conference e-Learning 2013, (2009), 135–142. Retrieved on the 15th November 2020 from https://files.eric.ed.gov/fulltext/ED562299.pdf

Wei, H. C., & Chou, C. (2020). Online learning performance and satisfaction: Do perceptions and readiness matter? Distance Education, 41 (1), 48–69.

*Yu, F. Y. (2019). The learning potential of online student-constructed tests with citing peer-generated questions. Interactive Learning Environments, 27 (2), 226–241. https://doi.org/10.1080/10494820.2018.1458040

*Yu, F. Y., & Chen, Y. J. (2014). Effects of student-generated questions as the source of online drill-and-practice activities on learning . British Journal of Educational Technology, 45 (2), 316–329. https://doi.org/10.1111/bjet.12036

*Yu, F. Y., & Pan, K. J. (2014). The effects of student question-generation with online prompts on learning. Educational Technology and Society, 17 (3), 267–279. Retrieved on the 15th November 2020 from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.565.643&rep=rep1&type=pdf

*Yu, W. F., She, H. C., & Lee, Y. M. (2010). The effects of web-based/non-web-based problem-solving instruction and high/low achievement on students’ problem-solving ability and biology achievement. Innovations in Education and Teaching International, 47 (2), 187–199. https://doi.org/10.1080/14703291003718927

Zhao, Y., Lei, J., Yan, B, Lai, C., & Tan, S. (2005). A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107 (8). https://doi.org/10.1111/j.1467-9620.2005.00544.x

*Zhong, B., Wang, Q., Chen, J., & Li, Y. (2017). Investigating the period of switching roles in pair programming in a primary school. Educational Technology and Society, 20 (3), 220–233. Retrieved on the 15th November 2020 from https://repository.nie.edu.sg/bitstream/10497/18946/1/ETS-20-3-220.pdf

Download references

Author information

Authors and Affiliations

Primary Education, Ministry of Turkish National Education, Mersin, Turkey

Hakan Ulum

Corresponding author

Correspondence to Hakan Ulum.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Ulum, H. The effects of online education on academic success: A meta-analysis study. Educ Inf Technol 27, 429–450 (2022). https://doi.org/10.1007/s10639-021-10740-8


Received : 06 December 2020

Accepted : 30 August 2021

Published : 06 September 2021

Issue Date : January 2022

DOI : https://doi.org/10.1007/s10639-021-10740-8


Keywords

  • Online education
  • Student achievement
  • Academic success
  • Meta-analysis
  • Research article
  • Open access
  • Published: 11 June 2020

Towards Understanding Online Question & Answer Interactions and their effects on student performance in large-scale STEM classes

  • David H Smith IV 1 ,
  • Qiang Hao 1 ,
  • Vanessa Dennen 2 ,
  • Michail Tsikerdekis 1 ,
  • Bradly Barnes 3 ,
  • Lilu Martin 1 &
  • Nathan Tresham 1  

International Journal of Educational Technology in Higher Education, volume 17, Article number: 20 (2020)


Online question & answer (Q & A) is a distinctive type of online interaction that affects student learning. Prior studies of online interaction in large-scale classes focused mainly on online discussion and were conducted mostly in non-STEM fields. This research aims to quantify the effects of online Q & A interactions on student performance in the context of STEM education. A total of 218 computer science students from a large university in the southeastern United States participated in this research. Data on four online Q & A activities were mined from the course's online Q & A forum, covering three student activities (asking questions, answering questions, and viewing questions/answers) and one instructor activity (answering questions/providing clarifications). These activities were found to have different effects on student performance. Viewing questions/answers had the greatest effect, while interaction with instructors showed minimal effects. This research helps fill the gap in research on online Q & A, and its results can inform the effective use of online Q & A in large-scale STEM courses.

Introduction

Higher education instructors, regardless of teaching modality and fields, have looked to online communication tools as a means of interacting with and assisting their students. These online communication tools support the primary form of student engagement in online classes, and typically fulfill a secondary or supportive role in classes with face-to-face meetings. These tools may be used to support teacher-directed activities (e.g., required topical discussion of course content), or student-initiated ones. In particular, students may find online communication tools useful when they are confused or need support to complete their course assignments. These student-driven help and information-seeking activities, in which questions and answers are exchanged between human partners, may rely on peer or instructor responses to student queries (Puustinen & Rouet, 2009 ). Although students might also seek help and ask questions during a face-to-face class session or instructor office hours, or alternatively by email, asynchronous online communication tools offer some distinct advantages in this scenario (Hwang, Singh, & Argote, 2015 ; Ryan, Pintrich, & Midgley, 2001 ). They can increase student access to help in situations where class sizes are large and opportunities for face-to-face interaction are limited. They allow students to ask questions when questions arise and receive crowdsourced answers from their peers and instructors (Reznitskaya & Wilkinson, 2017 ; Xia, Wang, & Hu, 2009 ). The generated questions and responses on asynchronous online communication tools can be archived, indexed and viewed as often as is needed, and all students in the class can benefit from the answers to a question asked by a single student.

Questions can be asked and answers provided in any online discussion forum, but dedicated question and answer (Q & A) platforms have been developed to facilitate this specific form of interaction. These Q & A forums are designed to focus on help-seeking interactions. Rather than writing more general posts, as is done on discussion forums, users on Q & A forums are prompted directly to ask or answer questions. This approach is more effective than a general discussion forum for Q & A interactions (Hao, Barnes, Branch, & Wright, 2017a). Each question begins its own thread, and the appropriate form of response is an answer. Q & A forums vary in their sophistication; for instance, some platforms allow users to upvote popular responses, prompt question-askers to rephrase their questions, or associate user reputation information with questions and answers. On the Internet at large, social question and answer (SQA) platforms like Stack Exchange ( www.stackexchange.com ), Yahoo! Answers ( answers.yahoo.com ) and Quora ( www.quora.com ) allow individuals to seek answers to their questions from the general population. SQA platforms crowdsource answers, in contrast to Virtual Reference (VR) services, where experts are contacted to provide answers (Shah & Kitzie, 2012). In a class setting, peer responses are most similar to crowdsourced responses on SQA platforms, varying in depth and accuracy. Instructor responses, in contrast, are perhaps best compared to VR services, which value quality and are more likely than SQA platforms to provide a thorough answer, including an explanation of how the answer was found or derived.

The value of online Q & A with regard to learning outcomes remains uncertain. Although the relationship between more general online discussion and student performance has been heavily studied (e.g., Beaudoin, 2002; Davies & Graff, 2005; Palmer, Holt, & Bray, 2008; Russo & Koesten, 2005), Q & A forums have been studied far less and have typically been researched in SQA settings rather than class contexts (e.g., Ji & Cui, 2016; Jin, Li, Zhong, & Zhai, 2015; Shah & Kitzie, 2012; Zhang, Liu, Chen, & Gong, 2017). Class-based online Q & A activities differ from SQA activities in some meaningful ways, such as user motivation and interactions (Zhang et al., 2017). For instance, users in an SQA community must form and maintain their own social networks in order to sustain the online Q & A community (Jin et al., 2015; Nasehi, Sillito, Maurer, & Burns, 2012). In a class context, students, as the primary contributors to and consumers of the Q & A, have instructor-defined reasons to engage in asking, answering, and viewing questions. Rather than being rewarded solely through attention and community reputation, students may be rewarded by engaging in activities and gaining knowledge that will ideally support their academic learning and performance (Deng, Liu, & Qi, 2011).

The aforementioned general class discussion studies consistently show both that student behavior patterns vary along with student performance in class, and that online discussion is beneficial to student learning (Beaudoin, 2002 ; Davies & Graff, 2005 ; Palmer et al., 2008 ; Russo & Koesten, 2005 ). This study expands these prior findings by examining different student behaviors and their relationships to student performance within a class-based online Q & A context. The study is situated in a large-scale introductory programming course. Online Q & A may be particularly useful in this and other STEM disciplines where the class size tends to be large and students have substantial needs for Q & A interaction (Henderson & Dancy, 2011 ).

Related studies

Online question & answer, help-seeking, information-seeking, and online discussion

Longstanding definitional problems have muddied the understanding of several important concepts related to online interaction among students, such as help-seeking, information-seeking, and online discussion. Help-seeking and information-seeking have been studied extensively over the last fifty years, mainly in face-to-face contexts, and their boundaries become blurry in online contexts (Puustinen & Rouet, 2009). Help-seeking refers to seeking social assistance from other people, while information-seeking refers to seeking information, typically from books or machines (Verhoeven, Heerwegh, & De Wit, 2010). The literature on both education and human-computer interaction in the last two decades has suggested a convergence of seeking social assistance and seeking information (Hao, Barnes, Branch, & Wright, 2017a; Puustinen & Rouet, 2009; Stahl & Bromme, 2009). Increasingly, help-seeking involves support from machines, such as search engines or tutoring systems, and information-seeking increasingly serves the same purpose as help-seeking (Deng et al., 2011). It is worth noting that both help-seeking and information-seeking emphasize the person who seeks help or information rather than the interactions among people or machines. Online discussion, in educational contexts, refers to course-related conversations among students on discussion boards or forums (Hammond, 2005). In contrast to help-seeking and information-seeking, online discussion emphasizes interactions among students (An, Shin, & Lim, 2009). Online discussion has found extensive application in the social sciences and liberal arts over the last two decades.

Online Q & A, in educational contexts, refers to interactions involving asking and answering learning questions among students and instructors (Yoo & Kim, 2014). Online Q & A shares some similarities with help-seeking, information-seeking, and online discussion, but also bears some notable differences. Online Q & A is similar to online discussion in that both emphasize interaction among people (Hao, Barnes, Wright, & Branch, 2017b; Jain, Chen, & Parkes, 2009). However, online Q & A comprises three distinct interactive behaviors: asking learning questions, answering questions from others, and viewing questions and answers from others. Online Q & A interactions tend to be situated within problem-solving learning activities. For this reason, they are particularly useful in courses where students work through problem sets or are engaged in other solution-oriented work, such as programming courses (Hao, Wright, Barnes, & Branch, 2016; Jain et al., 2009; Parnell et al., 2011).

Online question & answer interaction and academic success

Many studies have explored the relationship between online discussion and student performance over the last two decades. Such studies have contributed to the understanding of the effective use of online discussion in educational settings. Beaudoin (2012) found that education majors who were active in online discussion tended to have slightly better performance, although their counterparts who were inactive in online discussion still spent time on other course tasks. Beaudoin's findings were confirmed by Davies and Graff (2005), who also noted a significant difference between students who were active and those who were inactive in online discussion. Using social network analysis, Russo and Koesten (2005) noted in communication education that students who led discussions and those who only followed up tended to exhibit different learning styles, motivation, and performance. The findings on the relationship between online discussion and student performance are largely consistent. However, it is worth noting that the majority of these studies were conducted in the context of liberal arts and social science education.

Online discussion has not been as widely adopted in STEM fields as it has been in the humanities and social sciences, where discursive interactions and the exploration of multiple perspectives are more highly valued. When online discussion is employed in STEM classes, participation has been found to be flat and limited to minimal, deadline-driven posting (Palmer et al., 2008). The value of this type of participation to learning is questionable (de Jong, Sotiriou, & Gillet, 2014; Yu, Liu, & Chan, 2005). In contrast, Q & A forums have greater application potential in STEM education, given the typically large class sizes and the need for solution-oriented communication.

Despite the potential of Q & A forums in STEM fields, little research attention has been given to quantifying the effects of online Q & A interactions on student performance. Nurminen, Heino, and Ihantola (2017) surveyed computer science students and found that they were most likely to seek help online from previous social connections and more experienced peers. In this sense, students are likely to mine their personal networks for formal learning purposes in much the same way as other people do for more general Q & A needs (Jin et al., 2015). Consistent with such findings, Hao et al. (2016) found that computer science students tended to ask questions online of their peers more frequently than of instructors, a trend that became clearer among more experienced students. Mazzolini and Maddison (2003) investigated the roles of instructors in online Q & A interactions among astronomy students and found that active instructors inadvertently discouraged students from interacting with peers online, even though their participation was welcomed by students. Such studies confirmed that students in STEM fields do interact with each other online for Q & A purposes, but they did not measure the effects of the interactions on student performance. To the best of our knowledge, Yoo and Kim's (2014) study is the only one that focuses on the relationship between Q & A interaction and student performance. Yoo and Kim (2014) studied a computer science course and found that the frequency of student answers to questions, and how early students communicated their problems, correlated positively with their project performance. Essentially, students who posted more in the Q & A also performed better on the course project. However, the extent to which this finding can be generalized remains uncertain, and a causal relationship should not be assumed from the correlation. Additional studies are needed to help confirm these findings and explore how specific types of Q & A activities relate to student performance.

Social network analysis in online question & answer interactions

Social network analysis (SNA) refers to the process of investigating social structures through the use of networks and graph theory (Borgatti, Mehra, Brass, & Labianca, 2009 ). SNA is a good fit for analyzing student online Q & A interactions. SNA can capture the dynamics of online interactions, and provide a more granular investigation than many conventional metrics, such as posting and viewing frequency (Hernández-García, González-González, Jiménez-Zarco, & Chaparro-Peláez, 2015 ).

Although SNA has rarely been applied to online Q & A interactions, it has been applied to understand students' online discussions. Such research may contribute to understanding online Q & A interactions from the perspective of SNA. For instance, many of these studies demonstrated that a strong sense of community can lead to a strong social network, which has a higher rate of information flow and interpersonal support (De Laat, 2002; Shen, Nuankhieo, Huang, Amelung, & Laffey, 2008). A strong sense of community can be evidenced by a set of factors such as the student-instructor ratio, instructor immediateness, and student performance.

For instance, instructor immediateness contributes to a stable social network in online discussion. Instructor immediateness is a social network factor that refers to the extent to which instructors can facilitate online interaction. As the student-instructor ratio increases, instructors' ability to facilitate interactions among students becomes more constrained. However, if an instructor becomes dominant in the social network, student-student interactions might be discouraged (Rovai, 2007). As another example, student performance can be an important factor in determining the constructs of an online discussion network, such as network density and stability (Dawson, 2010). The higher students' academic performance, the more likely their network is to have higher density and centrality scores. Ghadirian and his colleagues (Ghadirian, Salehi, & Ayub, 2018) corroborated Dawson's findings and suggested that low-performing students are less willing to engage in discussion, given their lack of knowledge and, as a result, confidence. This condition is only worsened by their lack of interactions with high-performing and well-connected members of the network who might aid them in expanding their knowledge base.

Research design

Research design and questions

This study used a quasi-experimental research design to investigate the relationship between the activity levels of students and instructors on an online Q & A forum and student performance on course assessments. Specifically, the research questions addressed by this study are:

RQ1: How do students interact with their peers and instructors in online Q & A forums?

RQ2: To what extent do online student Q & A interactions predict their academic performance?

RQ3: To what extent does instructor facilitation in online Q & A forums predict student academic performance?

Participants and contexts

This study was conducted in a face-to-face introductory programming course offered at a large research university in the southeastern United States. To strengthen the stability and generalizability of our findings, we collected data from the same course in both the summer and fall semesters. A total of 218 participants took part in this study, consisting of 54 students in the summer semester and 164 students in the fall semester. During both semesters the course had the same instructor, syllabus, curriculum, exams, and grading rubrics. However, in the fall semester the course duration was 50% longer than in summer, and the course material was covered in more depth.

In both semesters, an online Q & A forum, Piazza ( piazza.com ), was set up for students to ask learning questions of their peers, TAs, and the instructor. As a Q & A forum, Piazza has been adopted widely in large-scale computer science and engineering courses (Hao, Galyardt, Barnes, Branch, & Wright, 2018). In our study, students were encouraged but not required to ask questions of their peers and instructor using Piazza. It is worth noting that student online Q & A interactions on Piazza were periodically pushed to each individual student in the same course through automated emails.

Data collection and analysis

To answer the three proposed research questions, we needed to collect student online Q & A interaction data. To achieve this, we programmed a script that mines Q & A data from Piazza and ran it on the two courses studied in this research. From the mined data, we were able to extract the following online Q & A interaction behaviors and frequency data (a sketch of this extraction follows the list):

Who asked which questions

Who answered which questions

Frequency of asking questions per person

Frequency of answering questions per person

Frequency of viewing questions and answers from other people per person

Frequency of posting notes and clarifications per instructor/TA

The frequency of posting notes and clarifications only applies to the instructor and teaching assistants (TAs). The other frequency data applies to the instructor, TAs, and students.
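The paper does not publish its mining script, and the export format is not specified. The following is a minimal sketch, assuming a hypothetical JSON export in which each mined post carries `type`, `asker`, and `answerers` fields, of how the per-person frequency counts above could be tallied:

```python
import json
from collections import Counter

def count_qa_activity(path):
    """Tally per-person asking/answering frequencies from a hypothetical JSON
    export of mined Piazza posts, e.g.
    [{"type": "question", "asker": "u1", "answerers": ["u2"], "views": 57}, ...]."""
    asked, answered = Counter(), Counter()
    qa_edges = []  # (asker, answerer) pairs, reused later for the SNA graphs
    with open(path) as f:
        posts = json.load(f)
    for post in posts:
        if post.get("type") != "question":
            continue  # notes/clarifications are counted separately for staff
        asker = post["asker"]
        asked[asker] += 1
        for answerer in post.get("answerers", []):
            answered[answerer] += 1
            qa_edges.append((asker, answerer))
    return asked, answered, qa_edges
```

Because the forum does not record who viewed which post, viewing would be tallied only as an aggregate frequency per post, consistent with its exclusion from the social network analysis described below.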

To answer RQ1, we applied SNA to the mined online Q & A interactions. Specifically, directed network graphs were constructed per semester. The following network factors were considered (a short sketch of computing them follows the list):

Density: The ratio between all connections in a network and the number of possible connections.

Transitivity: The extent to which interconnected transitive triads exist in a graph.

Closeness Centrality: Represents how close a node is to the rest of the network, based on the average shortest-path distance from that node to all other nodes.

Betweenness Centrality: Indicates the influence of a node in the network by the number of shortest paths between other nodes that pass through it.

Eigenvector Centrality: Defines the importance of a vertex by the degree to which it is connected with other well-connected vertices.
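These factors are standard graph metrics. As a hedged illustration (not the authors' analysis code), they can be computed with the `networkx` library from the asker-to-answerer edge list assumed in the earlier sketch:

```python
import networkx as nx

def network_factors(qa_edges):
    """Build the directed Q & A graph (edge: asker -> answerer) and compute
    the five network factors listed above."""
    G = nx.DiGraph()
    G.add_edges_from(qa_edges)
    return {
        "density": nx.density(G),
        # transitivity is computed here on the undirected projection of the graph
        "transitivity": nx.transitivity(G.to_undirected()),
        "closeness": nx.closeness_centrality(G),       # dict: node -> score
        "betweenness": nx.betweenness_centrality(G),   # dict: node -> score
        # power iteration may not converge on some graphs; the numpy variant is an alternative
        "eigenvector": nx.eigenvector_centrality(G, max_iter=1000),
    }, G
```

Density and transitivity describe the whole graph, while the three centrality measures are per-node scores that can be averaged or compared across groups.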

To answer RQ2 and RQ3, we collected student performance data on both programming assignments and paper-based exams. The reason for using two different measurements is the dual focus of programming courses: student programming capability and conceptual knowledge of programming languages and problem-solving. Programming assignments focus on measuring student programming capability, while paper-based exams focus on measuring student conceptual knowledge. Both programming assignments and exams were graded by the course instructor with the help of two teaching assistants. Once the online Q & A data and performance data were matched, all individual identifiers were removed from the dataset. Based on the collected data, we used blockwise regression to explore the extent to which student, instructor, and TA online Q & A interactions can predict student course performance.

Results

How do students interact with their peers and instructors in online Q & A forums (RQ1)?

To answer RQ1, we used descriptive analysis and SNA to investigate how students interact with their peers, TAs, and instructors in terms of asking and answering questions. The behavior of viewing questions was excluded from the SNA because the forum does not record who viewed which questions or answers. The descriptive analysis results are presented in Table 1. As the number of students increased, students tended to ask and answer more questions on average. As the numbers of questions and answers went up, the average number of views also increased substantially. Overall, the frequency data showed that students successfully formed online Q & A communities regardless of class size.

SNA uncovered a consistent theme across the two semesters: online Q & A interactions are student-driven. The evidence for this theme is that when instructors and TAs are removed from the social networks, the networks still show strong centralization (see Fig. 1). Although instructors and teaching assistants were the most central nodes in these networks (by several metrics), their removal did not substantially alter either local node statistics or overall network statistics (e.g., density and transitivity). To quantify our observation from the visualization, we calculated all network factors and compared the networks with and without instructors/TAs. The results are presented in Table 2. The quantitative comparisons confirmed our observation from the visualization and yielded consistent findings across the two semesters. This finding also indicates that the instructors and TAs were not attempting to take dominant roles in the online Q & A interactions.
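As a rough, self-contained sketch of this with/without comparison (the staff identifiers are hypothetical), the instructor/TA nodes can be dropped from the graph and the global statistics recomputed:

```python
import networkx as nx

def staff_removal_contrast(qa_edges, staff_ids):
    """Compare global network statistics before and after removing
    instructor/TA nodes (staff_ids is a hypothetical set of staff user names)."""
    G = nx.DiGraph()
    G.add_edges_from(qa_edges)
    G_students = G.copy()
    G_students.remove_nodes_from(staff_ids)  # drop instructor/TA vertices

    def global_stats(g):
        return {"density": nx.density(g),
                "transitivity": nx.transitivity(g.to_undirected())}

    return global_stats(G), global_stats(G_students)
```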

Fig. 1 Change of centrality with and without instructors/TAs

To further understand the characteristics of students who contributed more to asking and answering questions, we separated the students of each class into three groups by their overall academic performance (the total of both programming assignments and exams): high-, medium-, and low-performing students. A summary of their online interaction frequencies is presented in Table 3. The only consistent finding across the two semesters is that high-performing students tended to view more questions and answers than low-performing students [summer: t = 3.30, p < 0.025; fall: t = 2.87, p < 0.005; Bonferroni-corrected significance levels, as described in the next section].
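A minimal sketch of such a group comparison, assuming per-student view counts have already been extracted for the high- and low-performing groups, is an independent-samples t-test (here a Welch test from `scipy`, which does not assume equal variances; the paper does not state which variant was used):

```python
from scipy import stats

def compare_view_counts(high_views, low_views, alpha=0.025):
    """Welch t-test on per-student view counts for two performance groups;
    alpha is the Bonferroni-adjusted significance level."""
    t_stat, p_value = stats.ttest_ind(high_views, low_views, equal_var=False)
    return t_stat, p_value, p_value < alpha
```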

To what extent do student activity and instructor facilitation in online Q & A forums predict student performance (RQ2 and RQ3)?

To answer RQ2 and RQ3, we applied blockwise regressions to understand the extent to which the online Q & A interaction activities of students, TAs, and instructors predict student performance. Given the nature of programming courses, there were two metrics of student performance: programming assignments and paper exams. To avoid inflating the family-wise Type I error rate across these tests, we applied a Bonferroni correction to the significance levels. The adjusted significance levels were:

* p  < 0.025; ** p  < 0.005; *** p  < 0.0005.

We separated all predictors of online Q & A interactions into two blocks (a sketch of this blockwise comparison follows the list):

Student online Q & A behaviors:

○ Ask: ask questions to peers, TAs, and instructors online

○ Answer: answer questions asked by peers online

○ View: view questions, answers, notes, or announcements online

Instructor/TA online Q & A behaviors:

○ Facilitate: Answer questions, post announcements, or post notes
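The analysis code is not included in the paper. A hedged sketch of the blockwise (hierarchical) regression, assuming a hypothetical per-student `DataFrame` with columns `ask`, `answer`, `view`, `facilitate` (instructor/TA activity on that student's threads), and `assignment_score`, could use `statsmodels`:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

def blockwise_regression(df: pd.DataFrame):
    """Block 1: student Q & A behaviors; Block 2 adds instructor/TA facilitation.
    The nested models are compared with an F-test on the change in fit."""
    block1 = smf.ols("assignment_score ~ ask + answer + view", data=df).fit()
    block2 = smf.ols("assignment_score ~ ask + answer + view + facilitate",
                     data=df).fit()
    f_change = anova_lm(block1, block2)  # tests the improvement from adding block 2
    return block1.rsquared, block2.rsquared, f_change
```

The R-squared of block 1 corresponds to the variance explained by student behaviors alone, and the F-change test mirrors the reported checks of whether adding instructor/TA facilitation significantly improves the model.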

When only the block of student online Q & A behaviors was used as predictors, we found “view” to be a significant predictor of student performance on programming assignments. This finding was consistent across the two semesters. The results are presented in Table 4. Overall, asking, answering, and viewing questions/answers explained 16.2% of the variance in programming assignment performance in the summer semester and 11.6% of the variance in the fall semester.

In contrast, asking, answering, and viewing questions/answers explained 10.0% of the variance in exam performance in the summer semester and 3.6% of the variance in the fall semester. However, neither the overall models nor any single factor was found to be significant in predicting student exam performance.

When the block of instructor/TA online Q & A behaviors was added to the regression models predicting student programming assignment performance, no significant improvements were found for the summer semester [F(1, 17) = 0.148, p > 0.025] or the fall semester [F(1, 58) = 0.674, p > 0.025]. However, it is worth noting that when the instructor/TA behaviors were controlled, the student behavior “view” was still a significant predictor of student programming assignment performance in the summer semester. A similar effect was not found for the fall semester. The results are presented in Table 5.

When the block of instructor/TA online Q & A behaviors was added to the regression models predicting student exam performance, no significant improvements were found for the summer semester [F(1, 17) = 0.455, p > 0.025] or the fall semester [F(1, 58) = 0.003, p > 0.025]. Unlike the models predicting programming assignment performance, no single factor was found to be consistently significant in predicting student exam performance. When the instructor/TA behaviors were controlled, the student behavior “view” was found to be a significant predictor of student exam performance in the summer semester; however, this finding was not replicated in the fall semester.

Discussion

In this study, we systematically investigated online student interactions in large-scale face-to-face programming courses over two semesters. Of all the findings, we would like to highlight and discuss three that were consistent across the two experiments of this study.

First, low-performing students should be identified through early assessments and given more guidance/facilitation on the effective use of online Q & A. Although we investigated the predictive power of online Q & A interaction on student performance, it is important to note that our results do not imply causality between online Q & A and academic achievement. Even so, we found that high-performing students tended to use online Q & A forums more actively than their counterparts. This finding shares similarities with prior studies on both online Q & A and online discussion. Hao et al. (2017b) reached similar findings through a survey study on online Q & A in the context of computing education. Palmer et al. (2008), as well as Davies and Graff (2005), found that students who were active in online course discussions tended to outperform their counterparts in terms of the overall course grade. Compared with prior studies, this study took a further step by examining fine-grained differences among three types of student online interactions. It is worth noting that a significant difference was found only in viewing questions/answers among students with different academic performance. In other words, high-performing students did not necessarily ask or answer more questions, but they viewed significantly more questions and answers from both peers and instructors for learning purposes. In large-scale classes, low-performing students can be identified from an early assessment. If these students are given guidance on using online Q & A (e.g., reading questions and answers from peers), they may gradually learn how to use online Q & A effectively for learning purposes, which may lead to overall improvement in student performance. To understand why students at different performance levels behave this way, future studies may consider an exploratory qualitative study of student usage of Q & A in STEM classes.

Second, effective online Q & A may not require instructors to actively answer student questions. This finding was indicated by multiple lines of evidence accumulated while answering the three proposed research questions. Instructor facilitation was found to be insignificant in predicting student performance. More importantly, when we analyzed the social networks formed by online Q & A interactions, we noticed that the social networks remained robust and healthy by all network metrics when instructors/TAs were removed. This finding is consistent with prior studies on online discussion. For instance, high levels of instructor intervention in online discussion have been found to stifle peer interaction (Dennen, 2005) and to shorten discussion or stop further conversation (Knutas, Ikonen, & Porras, 2013; Mazzolini & Maddison, 2003). In the context of online Q & A situated in large-scale face-to-face classes, early intervention by instructors might remove a barrier to problem-solving immediately, but it may also eliminate potential cooperative problem solving and knowledge construction among students (Hao et al., 2016; Mazzolini & Maddison, 2003). Given the heavy teaching loads of large-scale classes, it might be more beneficial for instructors to intervene less in online Q & A among student peers unless a question stays unanswered for some time. However, it is worth noting that the generalizability of this finding may have limits. This study is situated in a large-scale face-to-face programming course. Introductory programming courses typically feature large numbers of students and intensive hands-on problem-solving tasks. Although online Q & A plays a significant role in enabling students to ask questions and have them answered, it is by no means the only approach; students can also meet the instructors and TAs in person to ask questions. In a purely online or blended learning environment, the role of instructors in online Q & A interactions may change significantly (Bliuc, Ellis, Goodyear, & Piggott, 2011; Khine & Lourdusamy, 2003). For instance, instructor facilitation of online Q & A interactions in a purely online learning environment may become more important than in face-to-face classes (Lynch, 2010).

Third, effective usage of and engagement in online Q & A are not necessarily observable. Viewing is not deemed an active participation behavior in online discussion (Davies & Graff, 2005; Hao et al., 2016). Typically, students who only engage in viewing discussion content are described as “lurkers” (Rovai, 2007). Lurkers have been found to have negative impacts on the development of community networks in online discussion forums (Nonnecke & Preece, 2000; Rovai, 2007): when there is a higher percentage of lurkers, the online community may risk collapsing or never being successfully formed. However, we found that students' viewing of questions/answers was a positive and significant predictor of their programming assignment performance. This finding was consistent across the two semesters, both with and without controlling for the facilitation from instructors/TAs. On the one hand, it is surprising that a seemingly passive behavior could have a significant effect on student performance. On the other hand, given that the focus of the course is problem-solving, it is understandable that active discussion is not necessary as long as a learning question is clearly answered; once it is, all students can benefit from reading it online. This differs from courses in the social sciences or liberal arts, where the intensity of online discussion and debate is positively correlated with student performance (e.g., Hara, Bonk, & Angeli, 2000; Knowlton, 2002). Viewing questions was found to have limited predictive power for student exam performance, but students also asked significantly fewer questions about the exams than about programming assignments. Given the nature of paper-based closed-book exams, students are less likely to rely heavily on online Q & A for their studying and preparation. In contrast, when students encounter difficulties or challenges while working on their programming assignments, online Q & A can promote just-in-time learning. Given the learning benefits of viewing questions/answers, instructors may consider information delivery approaches that best expose such information to students, such as periodically pushing information to students through emails (Warren & Meads, 2014).

Limitations

There are three major limitations to this study. First, this study was conducted in a single context, computing education. The extent to which our findings can be generalized to other fields, especially non-STEM fields, needs further investigation. Second, this study was conducted at only one higher education institution, which may limit the generalizability of the findings. To mitigate this limitation, a replication study was designed and conducted during the study period, and the findings were consistent across the two studies. To further verify the generalizability of our findings, future studies may consider replicating our studies across multiple higher education institutions. Third, the demographic information of the participants was not collected in this study. Although demographic factors, such as race or gender, have been found to be less important than student performance in terms of how students behave in online discussion, the lack of such data prevents further investigation into how students of different genders, races, or ages interact with each other in online Q & A. Future studies may consider collecting such data through surveys and providing a more fine-grained analysis of the relationship between student demographics and online Q & A interaction.

Conclusions

Online Q & A forums have found wide application in large-scale STEM courses during the last few years, but few studies have investigated the effects of online Q & A interactions on student learning. This study sought to fill this gap by quantifying the effect of fine-grained online Q & A interaction behaviors on student academic performance. The findings of this study provide evidence that online Q & A interactions share similarities with online discussion but also bear significant differences. Viewing questions/answers is typically deemed a passive behavior that does not indicate active participation in online discussion. However, this study demonstrated that such passive behavior contributed to stronger student academic performance in Q & A interactions. Given the significance of viewing questions/answers in predicting student assignment performance, instructors are recommended to push such information directly to students, for example through forum notification functions, which may further amplify its positive effects.

Availability of data and materials

The anonymized datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.


An, H., Shin, S., & Lim, K. (2009). The effects of different instructor facilitation approaches on students’ interactions during asynchronous online discussions. Computers & Education , 53 (3), 749–760.


Beaudoin, M. F. (2002). Learning or lurking? Tracking the “invisible” online student. The Internet and Higher Education , 5 (2), 147–155.

Beaudoin, B. B. (2012). Creating Community: From Individual Reflection to SoTL Transformation. International Journal for the Scholarship of Teaching & Learning , 6 (1).

Bliuc, A. M., Ellis, R. A., Goodyear, P., & Piggott, L. (2011). A blended learning approach to teaching foreign policy: Student experiences of learning through face-to-face and online discussion and their relationship to academic performance. Computers & Education , 56 (3), 856–864.

Borgatti, S. P., Mehra, A., Brass, D. J., & Labianca, G. (2009). Network analysis in the social sciences. Science , 323 (5916), 892–895.

Davies, J., & Graff, M. (2005). Performance in e-learning: Online participation and student grades. British Journal of Educational Technology , 36 (4), 657–663.

Dawson, S. (2010). ‘Seeing’ the learning community: An exploration of the development of a resource for monitoring online student networking. British Journal of Educational Technology , 41 (5), 736–752.

de Jong, T., Sotiriou, S., & Gillet, D. (2014). Innovations in STEM education: The go-lab federation of online labs. Smart Learning Environments , 1 (1), 3.

De Laat, M. (2002, January). Network and content analysis in an online community discourse. In Proceedings of the conference on computer support for collaborative learning: Foundations for a CSCL community (pp. 625-626). International Society of the Learning Sciences.

Deng, S., Liu, Y., & Qi, Y. (2011). An empirical study on determinants of web-based question-answer services adoption. Online Information Review , 35 (5), 789–798.

Dennen, V. P. (2005). From message posting to learning dialogues: Factors affecting learner participation in online discussion. Distance Education , 26 (1), 125–146.

Ghadirian, H., Salehi, K., & Ayub, A. F. M. (2018). Analyzing the social networks of high-and low-performing students in online discussion forums. American Journal of Distance Education , 32 (1), 27–42.

Hammond, M. (2005). A review of recent papers on online discussion in teaching and learning in higher education. Journal of Asynchronous Learning Networks , 9 (3), 9–23.


Hao, Q., Barnes, B., Branch, R. M., & Wright, E. (2017a). Predicting computer science students’ online help-seeking tendencies. Knowledge Management & E-Learning: An International Journal , 9 (1), 19–32.

Hao, Q., Barnes, B., Wright, E., & Branch, R. M. (2017b). The influence of achievement goals on online help seeking of computer science students. British Journal of Educational Technology , 48 (6), 1273–1283.

Hao, Q., Galyardt, A., Barnes, B., Branch, R. M., & Wright, E. (2018, October). Automatic identification of ineffective online student questions in computing education. In proceedings of 2018 IEEE Frontiers in Education Conference (FIE) (pp. 1-5). IEEE.

Hao, Q., Wright, E., Barnes, B., & Branch, R. M. (2016). What are the most important predictors of computer science students' online help-seeking behaviors? Computers in Human Behavior , 62 , 467–474.

Hara, N., Bonk, C. J., & Angeli, C. (2000). Content analysis of online discussion in an applied educational psychology course. Instructional Science , 28 (2), 115–152.

Henderson, C., & Dancy, M. H. (2011). Increasing the impact and diffusion of STEM education innovations. In Invited paper for the National Academy of Engineering, Center for the Advancement of Engineering Education Forum, Impact and Diffusion of Transformative Engineering Education Innovations, available at: http://www.Nae.Edu/file.Aspx .

Hernández-García, Á., González-González, I., Jiménez-Zarco, A. I., & Chaparro-Peláez, J. (2015). Applying social learning analytics to message boards in online distance learning: A case study. Computers in Human Behavior , 47 , 68–80.

Hwang, E. H., Singh, P. V., & Argote, L. (2015). Knowledge sharing in online communities: Learning to cross geographic and hierarchical boundaries. Organization Science, 26(6), 1593–1611.

Jain, S., Chen, Y., & Parkes, D. C. (2009). Designing incentives for online question and answer forums. In Proceedings of the 10th ACM Conference on Electronic Commerce (pp. 129–138). ACM.

Ji, Q., & Cui, D. (2016). The enjoyment of social Q&A websites usage: A multiple mediators model. Bulletin of Science, Technology & Society, 36(2), 98–106.

Jin, J., Li, Y., Zhong, X., & Zhai, L. (2015). Why users contribute knowledge to online communities: An empirical study of an online social Q&A community. Information & Management, 52(7), 840–849.

Khine, M. S., & Lourdusamy, A. (2003). Blended learning approach in teacher education: Combining face-to-face instruction, multimedia viewing and online discussion. British Journal of Educational Technology, 34(5), 671–675.

Knowlton, D. S. (2002). Promoting liberal arts thinking through online discussion: A practical application and its theoretical basis. Educational Technology & Society, 5(3), 189–194.

Knutas, A., Ikonen, J., & Porras, J. (2013). Communication patterns in collaborative software engineering courses: A case for computer-supported collaboration. In Proceedings of the 13th Koli Calling International Conference on Computing Education Research (pp. 169–177). ACM.

Lynch, D. J. (2010). Application of online discussion and cooperative learning strategies to online and blended college courses. College Student Journal, 44(3).

Mazzolini, M., & Maddison, S. (2003). Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Computers & Education, 40(3), 237–253.

Nasehi, S. M., Sillito, J., Maurer, F., & Burns, C. (2012, September). What makes a good code example? A study of programming Q&A in StackOverflow. In 2012 28th IEEE International Conference on Software Maintenance (ICSM) (pp. 25–34). IEEE.

Nonnecke, B., & Preece, J. (2000, April). Lurker demographics: Counting the silent. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 73–80). ACM.

Nurminen, M., Heino, P., & Ihantola, P. (2017). Friends and gurus: Do students ask for help from those they know or those who would know. In Proceedings of the 17th Koli Calling Conference on Computing Education Research (pp. 80–87). ACM.

Palmer, S., Holt, D., & Bray, S. (2008). Does the discussion help? The impact of a formally assessed online discussion on final student results. British Journal of Educational Technology, 39(5), 847–858.

Parnell, L. D., Lindenbaum, P., Shameer, K., Dall'Olio, G. M., Swan, D. C., Jensen, L. J., … Albert, I. (2011). BioStar: An online question & answer resource for the bioinformatics community. PLoS Computational Biology, 7(10), e1002216.

Puustinen, M., & Rouet, J. F. (2009). Learning with new technologies: Help seeking and information searching revisited. Computers & Education, 53(4), 1014–1019.

Reznitskaya, A., & Wilkinson, I. A. (2017). Truth matters: Teaching young students to search for the most reasonable answer. Phi Delta Kappan, 99(4), 33–38.

Rovai, A. P. (2007). Facilitating online discussions effectively. The Internet and Higher Education, 10(1), 77–88.

Russo, T. C., & Koesten, J. (2005). Prestige, centrality, and learning: A social network analysis of an online class. Communication Education, 54(3), 254–261.

Ryan, A. M., Pintrich, P. R., & Midgley, C. (2001). Avoiding seeking help in the classroom: Who and why? Educational Psychology Review, 13(2), 93–114.

Shah, C., & Kitzie, V. (2012). Social Q&A and virtual reference: Comparing apples and oranges with the help of experts and users. Journal of the American Society for Information Science and Technology, 63(10), 2020–2036.

Shen, D., Nuankhieo, P., Huang, X., Amelung, C., & Laffey, J. (2008). Using social network analysis to understand sense of community in an online learning environment. Journal of Educational Computing Research, 39(1), 17–36.

Stahl, E., & Bromme, R. (2009). Not everybody needs help to seek help: Surprising effects of metacognitive instructions to foster help-seeking in an online-learning environment. Computers & Education, 53, 1020–1028.

Verhoeven, J. C., Heerwegh, D., & De Wit, K. (2010). Information and communication technologies in the life of university freshmen: An analysis of change. Computers & Education, 55(1), 53–66.

Warren, I., & Meads, A. (2014). Push notification mechanisms for pervasive smartphone applications. IEEE Pervasive Computing, 13(2), 61–71.

Xia, H., Wang, R., & Hu, S. (2009). Social networks analysis of the knowledge diffusion among university students. In 2009 Second International Symposium on Knowledge Acquisition and Modeling (KAM '09) (Vol. 2, pp. 343–346). IEEE.

Yoo, J., & Kim, J. (2014). Can online discussion participation predict group project performance? Investigating the roles of linguistic features and participation patterns. International Journal of Artificial Intelligence in Education, 24(1), 8–32.

Yu, F. Y., Liu, Y. H., & Chan, T. W. (2005). A web-based learning system for question-posing and peer assessment. Innovations in Education and Teaching International, 42(4), 337–348.

Zhang, X., Liu, S., Chen, X., & Gong, Y. (2017). Social capital, motivations, and knowledge sharing intention in health Q&A communities. Management Decision, 55(7), 1536–1557.


Acknowledgements

Not applicable.

Author information

Authors and Affiliations

Western Washington University, 516 High Street, Bellingham, WA, 98225, USA

David H Smith IV, Qiang Hao, Michail Tsikerdekis, Lilu Martin & Nathan Tresham

Florida State University, Tallahassee, USA

Vanessa Dennen

University of Georgia, Athens, USA

Bradly Barnes


Contributions

QH led the project. DS analyzed data, interpreted the results, and drafted the first version of the paper. VD contributed to the literature review, MT contributed to the social network analysis, and BB contributed to the data collection. LM and NT contributed to the revision of the paper. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Qiang Hao.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Smith IV, D., Hao, Q., Dennen, V. et al. Towards Understanding Online Question & Answer Interactions and their effects on student performance in large-scale STEM classes. Int J Educ Technol High Educ 17, 20 (2020). https://doi.org/10.1186/s41239-020-00200-7


Received: 21 October 2019

Accepted: 31 March 2020

Published: 11 June 2020

DOI: https://doi.org/10.1186/s41239-020-00200-7

Keywords

  • Online help-seeking
  • Online question & answer
  • Large-scale classes
  • STEM education
  • Computing education
  • Social network analysis


FAQs: How Online Courses Work


  • The Benefits of Online Education
  • How Online Education Works
  • The Effectiveness of Online Education
  • Choosing Online Degree Programs
  • Technical Skills and Considerations
  • Paying for Online Degree Programs

Recent reports detail just how quickly colleges adopted online learning. According to the Babson Survey Research Group, university and student participation in online education is at an all-time high. Even some of the largest and most prestigious universities now offer online degrees. Despite its growing popularity, online education is still relatively new, and many students and academics are completely unacquainted with it. Questions and concerns are normal. This page addresses some of the most frequently asked questions about online degree programs. All answers are thoroughly researched; we include links to relevant studies whenever possible.

Question: What are some of the advantages of attending college online?

[Answer] Online education is known for its flexibility, but studies have identified several additional benefits of attending class online. Among them:

  • Communication: Many students are more comfortable engaging in meaningful discussions online than in a classroom. These students might have hearing or speech impairments; speak different languages; have severe social anxiety; or simply need more time to organize their thoughts.
  • Personalized learning: Not all students learn the same way. Web-based learning lets instructors deliver the same content through different media, such as videos or simulations. Online classes that provide round-the-clock access to materials and lectures also let students study when they feel most focused and engaged.
  • Accessibility: Online programs transcend time, geographic, and other barriers to higher education. This can be helpful for those who work full-time, live in remote regions, or serve in the military.
  • Adaptability: Learning management systems that integrate text-to-speech and other adaptive technologies support learners with physical, behavioral, and learning challenges.
  • Efficiency: Studies show online students tend to achieve the same learning results in half the time as classroom-based students.
  • Engagement: Online instructors can use games, social media, virtual badges, and other engaging technologies to motivate students and enhance learning.

Question: How does online education work on a day-to-day basis?

[Answer] Instructional methods, course requirements, and learning technologies can vary significantly from one online program to the next, but the vast bulk of them use a learning management system (LMS) to deliver lectures and materials, monitor student progress, assess comprehension, and accept student work. LMS providers design these platforms to accommodate a multitude of instructor needs and preferences. While some courses deliver live lectures using video conferencing tools, others allow students to download pre-recorded lectures and use message boards to discuss topics. Instructors may also incorporate simulations, games, and other engagement-boosters to enhance learning. Students should research individual programs to find out how and when they would report to class; how lectures and materials are delivered; how and how much they would collaborate with faculty and peers; and other important details. We address many of these instructional methods and LMS capabilities elsewhere in this guide.

Question: Can you really earn online degrees in hands-on fields like nursing and engineering?

[Answer] Yes and no. While schools do offer online and hybrid programs in these disciplines, students must usually meet additional face-to-face training requirements. Schools usually establish these requirements with convenience in mind. Students in fields like nursing, teaching, and social work, for example, may be required to complete supervised fieldwork or clinical placements, but they can often do so through local schools, hospitals, clinics, and other organizations. Similarly, students enrolled in the University of Virginia's Engineers PRODUCED in Virginia program can complete all their engineering classes online in a live format while gaining practical experience through strategic internships with employers across the state. Some online programs do require students to complete on-campus training, seminars, and assessments, but visits are often designed to minimize cost and travel. Students should consider these requirements when researching programs.

The Effectiveness and Credibility of Online Education

Question: Is online education as effective as face-to-face instruction?

[Answer] Online education may seem relatively new, but years of research suggest it can be just as effective as traditional coursework, and often more so. According to a U.S. Department of Education analysis of more than 1,000 learning studies, online students tend to outperform classroom-based students across most disciplines and demographics. Another major review published the same year found that online students had the advantage 70 percent of the time, a gap the authors projected would only widen as programs and technologies evolve.

While these reports list several plausible reasons students might learn more effectively online, such as greater control over their studies and more opportunities for reflection, the medium is only one of many factors that influence outcomes. Successful online students tend to be organized self-starters who can complete their work without reporting to a traditional classroom. Learning styles and preferences matter, too. Prospective students should research programs carefully to identify which ones offer the best chance of success.

Question: Do employers accept online degrees?

[Answer] All new learning innovations are met with some degree of scrutiny, but skepticism subsides as methods become more mainstream. Such is the case for online learning. Studies indicate employers who are familiar with online degrees tend to view them more favorably, and more employers are acquainted with them than ever before. The majority of colleges now offer online degrees, including most public, not-for-profit, and Ivy League universities. Online learning is also increasingly prevalent in the workplace as more companies invest in web-based employee training and development programs.

Question: Is online education more conducive to cheating?

[Answer] The concern that online students cheat more than traditional students is perhaps misplaced. When researchers at Marshall University conducted a study to measure the prevalence of cheating in online and classroom-based courses, they concluded, “somewhat surprisingly, the results showed higher rates of academic dishonesty in live courses.” The authors suggest the social familiarity of students in a classroom setting may lessen their sense of moral obligation.

Another reason cheating is less common in online programs is that colleges have adopted strict anti-cheating protocols and technologies. According to a report published by the Online Learning Consortium, some online courses require students to report to proctored testing facilities to complete exams, though virtual proctoring using shared screens and webcams is increasingly popular. Sophisticated identity verification tools like biometric analysis and facial recognition software are another way these schools combat cheating. Instructors often implement their own anti-cheating measures, too, like running research papers through plagiarism-detection programs or incorporating challenge-based questions in quizzes and exams. When combined, these measures can reduce academic dishonesty significantly.

In an interview with OnlineEducation.com, Dr. Susan Aldridge, president of Drexel University Online, discussed the overall approach many universities take to curbing cheating–an approach that includes both technical and policy-based prevention strategies.

“Like most online higher education providers, Drexel University employs a three-pronged approach to maintaining academic integrity among its virtual students,” said Dr. Aldridge. “We create solid barriers to cheating, while also making every effort to identify and sanction it as it occurs or directly after the fact. At the same time, we foster a principled community of inquiry that, in turn, motivates students to act in ethical ways. So with this triad in mind, we have implemented more than a few strategies and systems to ensure academic integrity.”

Question: How do I know if online education is right for me?

[Answer] Choosing the right degree program takes time and careful research no matter how one intends to study. Learning styles, goals, and programs always vary, but students considering online colleges must also weigh technical skills, the ability to self-motivate, and other factors specific to the medium. A number of colleges and universities have developed assessments to help prospective students determine whether they are prepared for online learning. You can access a compilation of assessments from many different colleges online. Online course demos and trials can also be helpful, particularly if they are offered by schools of interest. Students can call online colleges and ask to speak with an admissions representative who can clarify additional requirements and expectations.

Question: How do I know if an online degree program is credible?

[Answer] As with traditional colleges, some online schools are considered more credible than others. Reputation, post-graduation employment statistics, and enrollment numbers are not always reliable indicators of quality, which is why many experts advise students to look for accredited schools. In order for an online college to be accredited, a third-party organization must review its practices, finances, instructors, and other important criteria and certify that they meet certain quality standards. The certifying organization matters, too, since accreditation is only as reliable as the agency that grants it. Students should confirm online programs’ accrediting agencies are recognized by the U.S. Department of Education and/or the Council on Higher Education Accreditation before submitting their applications.

Online Student Support Services

Question: Do online schools offer the same student support services as traditional colleges?

[Answer] Colleges and universities tend to offer online students many of the same support services as campus-based students, though they may be administered differently. Instead of going to a campus library, online students may log in to virtual libraries stocked with digital materials, or work with research librarians by phone or email. Tutoring, academic advising, and career services might rely on video conferencing software, virtual meeting rooms, and other collaborative technologies. Some online colleges offer non-academic student support services as well. For example, Western Governors University's Student Assistance Program provides online students with 24/7 access to personal counseling, legal advice, and financial consulting services. A list of student support services is usually readily available on online colleges' websites.

Question: What technical skills do online students need?

[Answer] Online learning platforms are typically designed to be as user-friendly as possible: intuitive controls, clear instructions, and tutorials guide students through new tasks. However, students still need basic computer skills to access and navigate these programs. These skills include: using a keyboard and a mouse; running computer programs; using the Internet; sending and receiving email; using word processing programs; and using forums and other collaborative tools. Most online programs publish such requirements on their websites. If not, an admissions adviser can help.

Students who do not meet a program’s basic technical skills requirements are not without recourse. Online colleges frequently offer classes and simulations that help students establish computer literacy before beginning their studies. Microsoft’s online digital literacy curriculum is one free resource.

Question: What technology requirements must online students meet? What if they do not meet them?

[Answer] Technical requirements vary from one online degree program to the next, but most students need at minimum high-speed Internet access, a keyboard, and a computer capable of running specified online learning software. Courses using identity verification tools and voice- or web-conferencing software require webcams and microphones. Scanners and printers help, too. While online schools increasingly offer mobile apps for learning on-the-go, smartphones and tablets alone may not be sufficient.

Most online colleges list minimum technology requirements on their websites. Students who do not meet these requirements should contact schools directly to inquire about programs that can help. Some online schools lend or provide laptops, netbooks, or tablets for little to no cost, though students must generally return them right away if they withdraw from courses. Other colleges may offer grants and scholarships to help cover technical costs for students who qualify.

Question: Are online students eligible for financial aid?

[Answer] Qualifying students enrolled in online degree programs are eligible for many of the same loans, scholarships, and grants as traditional campus-based students. They are also free to apply for federal and state financial aid so long as they:

  • Attend online programs accredited by an organization recognized by either the U.S. Department of Education or the Council on Higher Education Accreditation.
  • Attend online schools that are authorized to operate in their state of residence.
  • Meet all additional application requirements, including those related to legal status, citizenship, age, and educational attainment.
  • Submit applications and all supporting materials by their deadlines.

Students can visit the U.S. Department of Education's Federal Student Aid website to review all eligibility requirements and deadlines, and to submit their Free Application for Federal Student Aid (FAFSA). Note that many states, colleges, and organizations use FAFSA to determine students' eligibility for other types of aid, including grants, scholarships, and loans. Students can contact prospective schools directly to speak with financial aid advisors.

Disclaimer: Financial aid is never guaranteed, even among eligible online students. Contact colleges and universities directly to clarify their policies.

Question: Can students use military education benefits to pay for online education?

[Answer] Active-duty and veteran military service members can typically apply their military education benefits toward an online degree, though they must still meet many of the same eligibility requirements detailed in the previous answer. Many state-level benefits have additional residency requirements. Most colleges have whole offices dedicated to helping these students understand and use their benefits effectively. They may also clarify applicable aid programs and requirements on their official websites. When in doubt, students should contact schools directly or visit the nearest Department of Veterans Affairs office to learn more about their options.

" Educational Benefits of Online Learning ," Blackboard Learning, Presented by California Polytechnic State University, San Louis Obispo

" Four Proven Advantages of Online Learning (That are NOT the Cost, Accessibility or Flexibility) , Coursera Blog, Coursera

" Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies ," U.S. Department of Education

" Twenty years of research on the academic performance differences between traditional and distance learning ," M. Sachar, Y. Neumann, Journal of Online Learning and Teaching, Merlot

" The Market Value of Online Degrees as a Credible Credential ," Calvin D. Foggle, Devonda Elliott, accessed via New York University

" Cheating in the Digital Age: Do Students Cheat More in Online Courses ?" George Watson, James Sottile, accessed via the University of Georga

" Student Identity Verification Tools and Live Proctoring in Accordance With Regulations to Combat Academic Dishonesty in Distance Education ," Vincent Termini, Franklin Hayes, Online Learning Consortium

" Student Readiness for Online Learning ," G. Hanley, Merlot

" Recognized Accrediting Organizations ," Council for Higher Education Accreditation  

" Digital Literacy ," Microsoft, Inc.  

" Free Application for Federal Student Aid ," Office of Federal Student Aid, U.S. Department of Education

Online Education Guide

  • Expert Advice for Online Students
  • Instructional Design in Online Programs
  • Learning Management Systems
  • Online Student Trends and Success Factors
  • Online Teaching Methods
  • Student Guide to Understanding and Avoiding Plagiarism
  • Student Services for Online Learners

University of Minnesota


Top 6 Questions People Ask About Online Learning


Since the invention of the internet, we have witnessed a huge change in the accessibility and flexibility of higher education. Not only can students earn their degrees at a distance and on their own schedule, but they can also complete certifications and trade programs with more ease than ever before.

If you’re considering online classes as a means to achieving your goals, you likely have questions. Here are some of the most common ones, with answers!

What Is Online Learning?

So, just what is online learning? This term refers to education that takes place in a completely virtual environment using an internet connection and a computer or device to connect to the school. In the online "classroom," you can do all the same things that in-person students do, such as:

  • Listening to lectures
  • Answering questions from a professor
  • Completing readings
  • Turning in assignments
  • Taking quizzes and tests
  • Meeting as a group

Some schools, programs, or courses combine online learning with in-person learning experiences. This model is known as "hybrid education," wherein students participate online most of the time. However, when learning objectives call for hands-on experience (say, practicing skills for a health profession or laboratory experiments), they can head to campus.

That said, many programs allow their students to complete the entire curriculum virtually. Degrees such as a Bachelor of Science in Software Engineering, for example, may not call for in-person learning at all. You can always contact admissions or the specific department if you want to learn more about delivery format.

Why Online Learning Is Good for Students

Despite the widespread accessibility of remote education, some students remain skeptical about online classes. Are you really learning if there’s not a professor present at the front of a lecture hall? Can you really learn the skills you need without the in-person interaction between students and faculty?

Ease and Accessibility

While some people feel online education lacks the intimacy and immediacy of a "real" classroom, it offers an educational channel to students who might otherwise not have the time or resources to attend. Online access has made it possible for students to enroll and participate in online classes with greater ease, from nearly anywhere, in a way that fits their schedules.

Affordability

Online courses are usually more affordable as well. According to the Education Data Initiative, an online degree is $36,595 cheaper than an in-person degree when the costs of tuition and attendance are compared. The average cost of attending a private university is $129,800 for an in-person degree and only $60,593 for an online degree.

It’s also estimated that students who commute to college for in-person classes pay $1,360 per year in transportation costs that an online student wouldn’t have to pay. Add in factors such as cheaper meals at home and more time to work, and it’s not hard to see why many students opt for online learning.

Top Questions About Online Learning

Despite the benefits, you likely still have some questions about online learning. Let’s take a look at six of the most common.

1. Are You Able to Earn Your Degree Completely Online? Yes, many (but not all) schools do offer this as an option. We’re not just talking about certificates or minors, either.

For instance, you can earn a Master of Science in Electrical and Computer Engineering from U of M Online. If you complete the entire program virtually, you will pay in-state tuition from anywhere in the United States, a major bonus. A good school should offer a searchable course catalog so you can compare options and see which programs have a required on-campus component.

2. How Long Does It Take to Earn a Degree Online? Most online programs mirror their in-person counterparts in terms of how long it takes to earn the degree. From certificates and minors to bachelor's or master's degrees, you're looking at roughly the same timeline for equivalent programs. Some programs also offer part-time options for students who need to accommodate work and family responsibilities.

Some schools or programs may limit how quickly you can move through the material. However, given the freedom and flexibility of online learning, it’s possible you can complete more coursework in less time than you could on campus. Talk to your admissions officer or program coordinator about specifics.

When first researching your options, you can again turn to the searchable course catalog. On each degree page, you should find the recommended timeline clearly listed.

3. Is an Online Degree Viewed Differently Than a Traditional Degree? Among the most common and pressing questions for online learning is whether future employers view online degrees with skepticism. The answer is an emphatic "no." Most online programs appear on your transcript the same as on-campus programs would.

You may also wonder if an online program will impact your plans for a higher degree later. As long as your degree is from an accredited institution, it won’t harm your chances of acceptance.

4. What Are Some Benefits of Online Learning? When you choose to learn online, you can:

  • Study more, due to the lack of commuting to, from, and around campus
  • Potentially take more classes, again because of the time savings
  • Get more immediate feedback from professors on assignments
  • Leverage the online resources that come with your course portal
  • Spend less money on your degree overall
  • Continue working or caring for family while going to school

5. Do Instructors Offer Help and Support to Students? Instructors are required to give the same amount of time and energy to their online classes as they do to in-person groups. In fact, many professors are enthusiastic about virtual learning because it means they have more flexibility and don’t have to commute either.

6. Can Students Have Success and Excel in Online Learning? Lastly, can you learn new skills, attain knowledge, and become successful in online learning? Unequivocally, the answer is yes! Online degree programs still afford you tutoring and career resources as well as full access to academic resources such as the library.

Plus, you will have the ability to transfer credits either to or from the degree program, just as you would with an on-campus one. In other words, you will find yourself and your goals in no way hampered by taking the online approach.

Online Learning

In summary, online learning offers you a ton of freedom and savings. It allows you to complete your work anywhere, from the office to the living room to on the road. And you can rest assured that you’ll get the same level of professorial support as you would from an on-campus program, as well as a degree that’s worth just as much.

Learn More Today

Ready to learn more? Reach out to U of M Online to ask questions or get information about specific programs today!

  • Cost of Online Education vs. Traditional Education
  • The top 5 questions people ask about online learning
  • https://online.umn.edu/programs-search
  • https://online.umn.edu/tuition-fees-and-financial-aid
  • https://online.umn.edu/story/academic-tutoring-and-career-resources
  • https://online.umn.edu/story/u-m-libraries
  • https://online.umn.edu/transfer-credit
  • https://online.umn.edu/

Shaping the Future of Online Learning

Published May 22, 2024

If you've been enrolled in any educational course or postsecondary program since 2020, chances are you've witnessed the rise in online learning firsthand.

The COVID-19 global pandemic shuttered storefronts, theaters, and classrooms alike, causing major disruptions in how goods and services were delivered. As consumers adopted Instacart for their grocery needs and streamed new blockbuster movies from the comfort of their living rooms, students needed an innovative way to bring their classes home. A year into the pandemic, over 60% of all undergraduate students were enrolled in at least one online course, with 28% exclusively enrolled in online courses, according to the National Center for Education Statistics.

There are other reasons for the widespread adoption, including accessibility. Rural and international students who may be far removed from traditional educational institutions can now attend Harvard classes anywhere there's an internet connection. Or consider working adults seeking to advance or switch careers. Life doesn't stop for a class, and attending one in person can be prohibitive. While still challenging, logging into a virtual classroom is far more manageable. Online education is for everyone.

Technological and pedagogical developments have helped online learning progress beyond the days of discussion boards and essay uploads. Now, students can enjoy a multimedia educational experience that is rooted in the latest research, all while participating in the community of their “virtual campus”.

If you're one of the millions of learners who have experienced online education, you might be interested to learn where it's going next. At Harvard Online, the question "What is the future of online learning?" guides an ongoing conversation that drives us every day.

In this blog, we sat down with Catherine Breen, Managing Director of Harvard Online. With more than two decades of senior executive leadership at Harvard University and oversight of Harvard Online, Breen has an invaluable perspective on the future of online learning, and the exciting role Harvard Online is playing in bringing the future into the present.


Catherine Breen, Managing Director of Harvard Online, in a team meeting.

Harvard Online (HO): How has the online learning landscape evolved in recent years? 

Catherine Breen (CB): At the beginning of the COVID-19 lockdown, there was a massive escalation in demand for online learning. Demand began to recede slowly as the months wore on, and by late 2022 it started to level out. But we observed two big changes: Internally, the demand for Harvard Online content was still almost three times higher than pre-pandemic. Externally, in reaction to the demand surge, there was significant and rapid growth of new online course offerings and companies that purveyed varying types of digital products.

HO: What is shaping the future of online learning today? 

CB: Because of the rapid and massive shift to online that occurred around the globe in the spring of 2020, the landscape changed permanently. There are many things shaping the future but here are just a few that I can see from my perspective:

  • Increased adoption of online learning across all ages and levels of education: Everyone expanded their online course catalogs; new companies and offerings sprung up everywhere.
  • Greater tech investment across organizations and industries: Organizations are investing more time, money, and effort into technology infrastructure, tools, and platforms to support online learning and participants in these courses.
  • New pedagogical methods to bridge the gap between traditional and novel learning methods: Instructors have adapted their teaching methods for online, hybrid, and blended environments.
  • Enhanced accessibility to quality education and learning experiences: Efforts have been made to improve access for students of all types, abilities, geographies, and backgrounds so that everyone can participate effectively.    

HO: What are the remaining challenges that online learning faces? 

CB: While these changes have improved the online learning experience, challenges remain, including addressing the digital divide, maximizing student engagement, and refining the quality of online courses.

The pandemic accelerated the adoption of online learning and its impact will likely continue to shape higher education for many years to come.  

HO: How does online learning contribute to Harvard's mission of promoting accessibility and inclusion in education, especially for learners who may not have traditional access to higher education?

CB: Online learning levels the playing field for learners in many ways.

Most students think that a Harvard-quality education is out of reach, for a variety of reasons. With online courses, however, learners from around the country and the world can take courses with Harvard instructors at their own pace at a more affordable price point.

Our online courses also typically incorporate a range of multimedia elements, allowing students with different learning styles to flourish. We also ensure that our online learning experiences are accessible to all learners, including those with disabilities. This commitment to inclusivity aligns with the broader goals of promoting equitable access to education.

Lastly, our online courses often include discussion forums and virtual communities where learners can connect and collaborate. This allows for interactions among students from diverse backgrounds and experiences, fostering a sense of belonging and inclusion.  

It’s clear that online learning has a lot to offer everyone, and it’s only getting better. In our next blog in this series, we’ll hear more from Cathy on how institutions can implement online learning modalities effectively. 

If you missed the first blog in this series detailing the future of online learning, you can check it out here. To learn more about Harvard Online, explore our fully online course catalog here.

Related Blog Posts

Uniref Brings Harvard Courses on Web Programming to Syrian Refugees

Harvard Online is proud to provide access to education and experiences that help communities thrive.

Harvard Online in Your Workplace: Elevate Your Team's Professional Development

At Harvard Online we understand the value of an educated and skilled workforce.

A Decade of Innovation: Online Learning at Harvard

We are always asking, “What does the future look like for teaching and learning?”


Research Question


A research question serves as the foundation of any academic study, driving the investigation and framing the scope of inquiry. It focuses the research efforts, ensuring that the study addresses pertinent issues systematically. Crafting a strong research question is essential as it directs the methodology, data collection, and analysis, ultimately shaping the study’s conclusions and contributions to the field.

What is a Research Question?

A research question is the central query that guides a study, focusing on a specific problem or issue. It defines the purpose and direction of the research, influencing the methodology and analysis. A well-crafted research question ensures the study remains relevant, systematic, and contributes valuable insights to the field.

Types of Research Questions

Research questions are a crucial part of any research project. They guide the direction and focus of the study. Here are the main types of research questions:

1. Descriptive Research Questions

These questions aim to describe the characteristics or functions of a specific phenomenon or group. They often begin with “what,” “who,” “where,” “when,” or “how.”

  • What are the common symptoms of depression in teenagers?

2. Comparative Research Questions

These questions compare two or more groups or variables to identify differences or similarities.

  • How do the academic performances of students in private schools compare to those in public schools?

3. Correlational Research Questions

These questions seek to identify the relationships between two or more variables. They often use terms like “relationship,” “association,” or “correlation.”

  • Is there a relationship between social media usage and self-esteem among adolescents?

4. Causal Research Questions

These questions aim to determine whether one variable causes or influences another. They are often used in experimental research.

  • Does a new teaching method improve student engagement in the classroom?

5. Exploratory Research Questions

These questions are used when the researcher is exploring a new area or seeking to understand a complex phenomenon. They are often open-ended.

  • What factors contribute to the success of start-up companies in the tech industry?

6. Predictive Research Questions

These questions aim to predict future occurrences based on current or past data. They often use terms like “predict,” “forecast,” or “expect.”

  • Can high school GPA predict college success?

7. Evaluative Research Questions

These questions assess the effectiveness or impact of a program, intervention, or policy.

  • How effective is the new community outreach program in reducing homelessness?

8. Ethnographic Research Questions

These questions are used in qualitative research to understand cultural phenomena from the perspective of the participants.

  • How do cultural beliefs influence healthcare practices in rural communities?

9. Case Study Research Questions

These questions focus on an in-depth analysis of a specific case, event, or instance.

  • What were the critical factors that led to the failure of Company X?

10. Phenomenological Research Questions

These questions explore the lived experiences of individuals to understand a particular phenomenon.

  • What is the experience of living with chronic pain?

Research Question Format

A well-formulated research question is essential for guiding your study effectively. Follow this format to ensure clarity and precision:

  • Begin with a broad subject area.
  • Example: “Education technology”
  • Define a specific aspect or variable.
  • Example: “Impact of digital tools”
  • Decide if you are describing, comparing, or investigating relationships.
  • Example: “Effectiveness”
  • Identify who or what is being studied.
  • Example: “High school students”
  • Formulate the complete question.
  • Example: “How effective are digital tools in enhancing the learning experience of high school students?”
Sample format: "How does [specific aspect] affect [target population] in [context]?"
Example: "How does the use of digital tools affect the academic performance of high school students in urban areas?"

Research Question Examples

Research Questions in Business

  • “What are the primary factors influencing customer loyalty in the retail industry?”
  • “How does employee satisfaction differ between remote work and in-office work environments in tech companies?”
  • “What is the relationship between social media marketing and brand awareness among small businesses?”
  • “How does implementing a four-day workweek impact productivity in consulting firms?”
  • “What are the emerging trends in consumer behavior post-COVID-19 in the e-commerce sector?”
  • “Why do some startups succeed in attracting venture capital while others do not?”
  • “How effective is corporate social responsibility in enhancing brand reputation for multinational companies?”
  • “How do decision-making processes in family-owned businesses differ from those in publicly traded companies?”
  • “What strategies do successful entrepreneurs use to scale their businesses in competitive markets?”
  • “How does supply chain management affect the operational efficiency of manufacturing firms?”

Research Questions in Education

  • “What are the most common challenges faced by first-year teachers in urban schools?”
  • “How do student achievement levels differ between traditional classrooms and blended learning environments?”
  • “What is the relationship between parental involvement and student academic performance in elementary schools?”
  • “How does the implementation of project-based learning affect critical thinking skills in middle school students?”
  • “What are the emerging trends in the use of artificial intelligence in education?”
  • “Why do some students perform better in standardized tests than others despite similar instructional methods?”
  • “How effective is the flipped classroom model in improving student engagement and learning outcomes in high school science classes?”
  • “How do teachers’ professional development programs impact teaching practices and student outcomes in rural schools?”
  • “What strategies can be employed to reduce the dropout rate among high school students in low-income areas?”
  • “How does classroom size affect the quality of teaching and learning in elementary schools?”

Research Questions in Health Care

  • “What are the most common barriers to accessing mental health services in rural areas?”
  • “How does patient satisfaction differ between telemedicine and in-person consultations in primary care?”
  • “What is the relationship between diet and the incidence of type 2 diabetes in adults?”
  • “How does regular physical activity influence the recovery rate of patients with cardiovascular diseases?”
  • “What are the emerging trends in the use of wearable technology for health monitoring?”
  • “Why do some patients adhere to their medication regimen while others do not despite similar health conditions?”
  • “How effective are community-based health interventions in reducing obesity rates among children?”
  • “How do interdisciplinary team meetings impact patient care in hospitals?”
  • “What strategies can be implemented to reduce the spread of infectious diseases in healthcare settings?”
  • “How does nurse staffing level affect patient outcomes in intensive care units?”

Research Questions in Computer Science

  • “What are the key features of successful machine learning algorithms used in natural language processing?”
  • “How does the performance of quantum computing compare to classical computing in solving complex optimization problems?”
  • “What is the relationship between software development methodologies and project success rates in large enterprises?”
  • “How does the implementation of cybersecurity protocols impact the frequency of data breaches in financial institutions?”
  • “What are the emerging trends in blockchain technology applications beyond cryptocurrency?”
  • “Why do certain neural network architectures outperform others in image recognition tasks?”
  • “How effective are different code review practices in reducing bugs in open-source software projects?”
  • “How do agile development practices influence team productivity and product quality in software startups?”
  • “What strategies can improve the scalability of distributed systems in cloud computing environments?”
  • “How does the choice of programming language affect the performance and maintainability of enterprise-level software applications?”

Research Questions in Psychology

  • “What are the most common symptoms of anxiety disorders among adolescents?”
  • “How does the level of job satisfaction differ between remote workers and in-office workers?”
  • “What is the relationship between social media use and self-esteem in teenagers?”
  • “How does cognitive-behavioral therapy (CBT) affect the severity of depression symptoms in adults?”
  • “What are the emerging trends in the treatment of post-traumatic stress disorder (PTSD)?”
  • “Why do some individuals develop resilience in the face of adversity while others do not?”
  • “How effective are mindfulness-based interventions in reducing stress levels among college students?”
  • “How does group therapy influence the social skills development of children with autism spectrum disorder?”
  • “What strategies can improve the early diagnosis of bipolar disorder in young adults?”
  • “How do sleep patterns affect cognitive functioning and academic performance in high school students?”

More Research Question Examples

Research Question Examples for Students

  • “What are the primary study habits of high-achieving college students?”
  • “How do academic performances differ between students who participate in extracurricular activities and those who do not?”
  • “What is the relationship between time management skills and academic success in high school students?”
  • “How does the use of technology in the classroom affect students’ engagement and learning outcomes?”
  • “What are the emerging trends in online learning platforms for high school students?”
  • “Why do some students excel in standardized tests while others struggle despite similar study efforts?”
  • “How effective are peer tutoring programs in improving students’ understanding of complex subjects?”
  • “How do different teaching methods impact the learning process of students with learning disabilities?”
  • “What strategies can help reduce test anxiety among middle school students?”
  • “How does participation in group projects affect the development of collaboration skills in university students?”

Research Question Examples for College Students

  • “What are the most common stressors faced by college students during final exams?”
  • “How does academic performance differ between students who live on campus and those who commute?”
  • “What is the relationship between part-time employment and GPA among college students?”
  • “How does participation in study abroad programs impact cultural awareness and academic performance?”
  • “What are the emerging trends in college students’ use of social media for academic purposes?”
  • “Why do some college students engage in academic dishonesty despite awareness of the consequences?”
  • “How effective are university mental health services in addressing students’ mental health issues?”
  • “How do different learning styles affect the academic success of college students in online courses?”
  • “What strategies can be employed to improve retention rates among first-year college students?”
  • “How does participation in extracurricular activities influence leadership skills development in college students?”

Research Question Examples in Statistics

  • “What are the most common statistical methods used in medical research?”
  • “How does the accuracy of machine learning models compare to traditional statistical methods in predicting housing prices?”
  • “What is the relationship between sample size and the power of a statistical test in clinical trials?”
  • “How does the use of random sampling affect the validity of survey results in social science research?”
  • “What are the emerging trends in the application of Bayesian statistics in data science?”
  • “Why do some datasets require transformation before applying linear regression models?”
  • “How effective are bootstrapping techniques in estimating the confidence intervals of small sample data?”
  • “How do different imputation methods impact the results of analyses with missing data?”
  • “What strategies can improve the interpretation of interaction effects in multiple regression analysis?”
  • “How does the choice of statistical software affect the efficiency of data analysis in academic research?”

Research Question Examples in Sociology

  • “What are the primary social factors contributing to urban poverty in major cities?”
  • “How does the level of social integration differ between immigrants and native-born citizens in urban areas?”
  • “What is the relationship between educational attainment and social mobility in different socioeconomic classes?”
  • “How does exposure to social media influence political participation among young adults?”
  • “What are the emerging trends in family structures and their impact on child development?”
  • “Why do certain communities exhibit higher levels of civic engagement than others?”
  • “How effective are community policing strategies in reducing crime rates in diverse neighborhoods?”
  • “How do socialization processes differ in single-parent households compared to two-parent households?”
  • “What strategies can be implemented to reduce racial disparities in higher education enrollment?”
  • “How does the implementation of public housing policies affect the quality of life for low-income families?”

Research Question Examples in Biology

  • “What are the primary characteristics of the various stages of mitosis in eukaryotic cells?”
  • “How do the reproductive strategies of amphibians compare to those of reptiles?”
  • “What is the relationship between genetic diversity and the resilience of plant species to climate change?”
  • “How does the presence of pollutants in freshwater ecosystems impact the growth and development of aquatic organisms?”
  • “What are the emerging trends in the use of CRISPR technology for gene editing in agricultural crops?”
  • “Why do certain bacteria develop antibiotic resistance more rapidly than others?”
  • “How effective are different conservation strategies in protecting endangered species?”
  • “How do various environmental factors influence the process of photosynthesis in marine algae?”
  • “What strategies can enhance the effectiveness of reforestation programs in tropical rainforests?”
  • “How does the method of seed dispersal affect the spatial distribution and genetic diversity of plant populations?”

Research Question Examples in History

  • “What were the key social and economic factors that led to the Industrial Revolution in Britain?”
  • “How did the political systems of ancient Athens and ancient Sparta differ in terms of governance and citizen participation?”
  • “What is the relationship between the Renaissance and the subsequent scientific revolution in Europe?”
  • “How did the Treaty of Versailles contribute to the rise of Adolf Hitler and the onset of World War II?”
  • “What are the emerging perspectives on the causes and impacts of the American Civil Rights Movement?”
  • “Why did the Roman Empire decline and eventually fall despite its extensive power and reach?”
  • “How effective were the New Deal programs in alleviating the effects of the Great Depression in the United States?”
  • “How did the processes of colonization and decolonization affect the political landscape of Africa in the 20th century?”
  • “What strategies did the suffragette movement use to secure voting rights for women in the early 20th century?”
  • “How did the logistics and strategies of the D-Day invasion contribute to the Allied victory in World War II?”

Importance of Research Questions

Research questions are fundamental to the success and integrity of any study. Their importance can be highlighted through several key aspects:

  • Research questions provide a clear focus and direction for the study, ensuring that the researcher remains on track.
  • Example: “How does online learning impact student engagement in higher education?”
  • They establish the boundaries of the research, determining what will be included or excluded.
  • Example: “What are the effects of air pollution on respiratory health in urban areas?”
  • Research questions dictate the choice of research design, methodology, and data collection techniques.
  • Example: “What is the relationship between physical activity and mental health in adolescents?”
  • They make the objectives of the research explicit, providing clarity and precision to the study’s goals.
  • Example: “Why do some startups succeed in securing venture capital while others fail?”
  • Well-crafted research questions emphasize the significance and relevance of the study, justifying its importance.
  • Example: “How effective are public health campaigns in increasing vaccination rates among young adults?”
  • They enable a systematic approach to inquiry, ensuring that the study is coherent and logically structured.
  • Example: “What are the social and economic impacts of remote work on urban communities?”
  • Research questions offer a framework for analyzing and interpreting data, guiding the researcher in making sense of the findings.
  • Example: “How does social media usage affect self-esteem among teenagers?”
  • By addressing specific gaps or exploring new areas, research questions ensure that the study contributes meaningfully to the existing body of knowledge.
  • Example: “What are the emerging trends in the use of artificial intelligence in healthcare?”
  • Clear and precise research questions increase the credibility and reliability of the research by providing a focused approach.
  • Example: “How do educational interventions impact literacy rates in low-income communities?”
  • They help in clearly communicating the purpose and findings of the research to others, including stakeholders, peers, and the broader academic community.
  • Example: “What strategies are most effective in reducing youth unemployment in developing countries?”

Research Question vs. Hypothesis

A research question states what a study seeks to find out; a hypothesis states a testable prediction of the answer. Researchers typically derive one or more hypotheses from a research question once they can make a specific, falsifiable claim about the expected relationship between variables.

Characteristics of Research Questions

Research questions are fundamental to the research process as they guide the direction and focus of a study. Here are the key characteristics of effective research questions:

1. Clear and Specific

  • The question should be clearly articulated and specific enough to be understood without ambiguity.
  • Example: “What are the effects of social media on teenagers’ mental health?” rather than “How does social media affect people?”

2. Focused and Researchable

  • The question should be narrow enough to be answerable through research and data collection.
  • Example: “How does participation in extracurricular activities impact academic performance in high school students?” rather than “How do activities affect school performance?”

3. Complex and Analytical

  • The question should require more than a simple yes or no answer and should invite analysis and discussion.
  • Example: “What factors contribute to the success of renewable energy initiatives in urban areas?” rather than “Is renewable energy successful?”

4. Relevant and Significant

  • The question should address an important issue or problem in the field of study and contribute to knowledge or practice.
  • Example: “How does climate change affect agricultural productivity in developing countries?” rather than “What is climate change?”

5. Feasible and Practical

  • The question should be feasible to answer within the constraints of time, resources, and access to information.
  • Example: “What are the challenges faced by remote workers in the tech industry during the COVID-19 pandemic?” rather than “What are the challenges of remote work?”

6. Original and Novel

  • The question should offer a new perspective or explore an area that has not been extensively studied.
  • Example: “How do virtual reality technologies influence empathy in healthcare training?” rather than “What is virtual reality?”

7. Ethically Sound

  • The question should be framed in a way that ensures the research can be conducted ethically.
  • Example: “What are the impacts of privacy laws on consumer data protection in the digital age?” rather than “How can we collect personal data more effectively?”

8. Open-Ended

  • The question should encourage detailed responses and exploration, rather than limiting answers to a simple yes or no.
  • Example: “In what ways do cultural differences affect communication styles in multinational companies?” rather than “Do cultural differences affect communication?”

9. Aligned with Research Goals

  • The question should align with the overall objectives of the research project or study.
  • Example: “How do early childhood education programs influence long-term academic achievement?” if the goal is to understand educational impacts.

10. Based on Prior Research

  • The question should build on existing literature and research, identifying gaps or new angles to explore.
  • Example: “What strategies have proven effective in reducing urban air pollution in European cities?” after reviewing current studies on air pollution strategies.

Benefits of Research Questions

Research questions are fundamental to the research process and offer numerous benefits, which include the following:

1. Guides the Research Process

A well-defined research question provides a clear focus and direction for your study. It helps in determining what data to collect, how to collect it, and how to analyze it.

Benefit: Ensures that the research stays on track and addresses the specific issue at hand.

2. Clarifies the Purpose of the Study

Research questions help to articulate the purpose and objectives of the study. They make it clear what the researcher intends to explore, describe, compare, or test.

Benefit: Helps in communicating the goals and significance of the research to others, including stakeholders and funding bodies.

3. Determines the Research Design

The type of research question informs the research design, including the choice of methodology, data collection methods, and analysis techniques.

Benefit: Ensures that the chosen research design is appropriate for answering the specific research question, enhancing the validity and reliability of the results.

4. Enhances Literature Review

A well-crafted research question provides a framework for conducting a thorough literature review. It helps in identifying relevant studies, theories, and gaps in existing knowledge.

Benefit: Facilitates a comprehensive understanding of the topic and ensures that the research is grounded in existing literature.

5. Focuses Data Collection

Research questions help in identifying the specific data needed to answer them. This focus prevents the collection of unnecessary data and ensures that all collected data is relevant to the study.

Benefit: Increases the efficiency of data collection and analysis, saving time and resources.

6. Improves Data Analysis

Having a clear research question aids in the selection of appropriate data analysis methods. It helps in determining how the data will be analyzed to draw meaningful conclusions.

Benefit: Enhances the accuracy and relevance of the findings, making them more impactful.

7. Facilitates Hypothesis Formation

In quantitative research, research questions often lead to the development of hypotheses that can be tested statistically.

Benefit: Provides a basis for hypothesis testing, which is essential for establishing cause-and-effect relationships.

8. Supports Result Interpretation

Research questions provide a lens through which the results of the study can be interpreted. They help in understanding what the findings mean in the context of the research objectives.

Benefit: Ensures that the conclusions drawn from the research are aligned with the original aims and objectives.

9. Enhances Reporting and Presentation

A clear research question makes it easier to organize and present the research findings. It helps in structuring the research report or presentation logically.

Benefit: Improves the clarity and coherence of the research report, making it more accessible and understandable to the audience.

10. Encourages Critical Thinking

Formulating research questions requires critical thinking and a deep understanding of the subject matter. It encourages researchers to think deeply about what they want to investigate and why.

Benefit: Promotes a more thoughtful and analytical approach to research, leading to more robust and meaningful findings.

How to Write a Research Question

Crafting a strong research question is crucial for guiding your study effectively. Follow these steps to write a clear and focused research question:

Identify a Broad Topic:

Start with a general area of interest that you are passionate about or that is relevant to your field. Example: “Climate change”

Conduct Preliminary Research:

Explore existing literature and studies to understand the current state of knowledge and identify gaps. Example: “Impact of climate change on agriculture”

Narrow Down the Topic:

Focus on a specific aspect or issue within the broad topic to make the research question more manageable. Example: “Effect of climate change on crop yields”

Consider the Scope:

Ensure the question is neither too broad nor too narrow. It should be specific enough to be answerable but broad enough to allow for thorough exploration. Example: “How does climate change affect corn crop yields in the Midwest United States?”

Determine the Research Type:

Decide whether your research will be descriptive, comparative, relational, or causal, as this will shape your question. Example: “How does climate change affect corn crop yields in the Midwest United States over the past decade?”

Formulate the Question:

Write a clear, concise question that specifies the variables, population, and context. Example: “What is the impact of increasing temperatures and changing precipitation patterns on corn crop yields in the Midwest United States from 2010 to 2020?”

Ensure Feasibility:

Make sure the question can be answered within the constraints of your resources, time, and data availability. Example: “How have corn crop yields in the Midwest United States been affected by climate change-related temperature increases and precipitation changes between 2010 and 2020?”

Review and Refine:

Evaluate the question for clarity, focus, and relevance. Revise as necessary to ensure it is well-defined and researchable. Example: “What are the specific impacts of temperature increases and changes in precipitation patterns on corn crop yields in the Midwest United States from 2010 to 2020?”

FAQs

What is a research question?

A research question is a specific query guiding a study’s focus and objectives, shaping its methodology and analysis.

Why is a research question important?

It provides direction, defines scope, ensures relevance, and guides the methodology of the research.

How do you formulate a research question?

Identify a topic, narrow it down, conduct preliminary research, and ensure it is clear, focused, and researchable.

What makes a good research question?

Clarity, specificity, feasibility, relevance, and the ability to guide the research effectively.

Can a research question change?

Yes, it can evolve based on initial findings, further literature review, and the research process.

What is the difference between a research question and a hypothesis?

A research question guides the study; a hypothesis is a testable prediction about the relationship between variables.

How specific should a research question be?

It should be specific enough to provide clear direction but broad enough to allow for comprehensive investigation.

What are examples of good research questions?

Examples include: “How does social media affect academic performance?” and “What are the impacts of climate change on agriculture?”

Can a research question be too broad?

Yes, an overly broad question can make the research unfocused and challenging to address comprehensively.

What role does a research question play in literature reviews?

It helps identify relevant studies, guides the search for literature, and frames the review’s focus.

AI Governance, Literacy, and the Power of Connection: Three themes from J-WEL Week 2024

Attendees explored big-picture questions and collective solutions for addressing generative artificial intelligence in an educational setting.

By Carolyn Tiernan

As the impact of generative AI technology on education continues to dominate headlines, the MIT Jameel World Education Lab (J-WEL) at MIT Open Learning and its member institutions are exploring this topic head-on.

They recently gathered for J-WEL Week, the lab’s flagship in-person event that is themed each year around a topic that is top-of-mind for member institutions. This year’s topic: generative AI in education. After a week of shared conversations, collaborative workshops, insightful presentations, and in-person demonstrations from the MIT community designed to deepen understanding and fuel collaboration, three key themes emerged.

University governance of AI is taking shape in real time

[Photo: Daniel Huttenlocher speaks at a podium with an MIT banner behind him.]

Several sessions during J-WEL Week addressed the theme of governance, as all member institutions grapple with what policies to implement (or not) regarding the use of generative AI.

Not only is there concern that students will use the technology to bypass real learning, but educators are also uncertain about the level of understanding they need to have of generative AI tools.

During day 1 member updates, two universities shared how they are tackling the question of AI governance. Professor Melchor Sanchez Mendiola spoke about the GenAI in Education working group that was created at the National Autonomous University of Mexico (UNAM) to explore this topic. The working group, including engineers, educators, sociologists, computer specialists, and distance education specialists, is proposing guidelines for UNAM that take into account different perspectives across the university ecosystem.

Similarly, Professor Fabio Cozman spoke about a workshop hosted in 2023 by the Center for Artificial Intelligence and Machine Learning at the University of São Paulo (USP). The group met to discuss the impact of ChatGPT and other generative AI models on education, and ultimately produced a set of recommendations on how to study and discuss AI within USP.

MIT Schwarzman College of Computing Dean Daniel Huttenlocher focused on AI governance from a macro lens during his presentation on day two. Huttenlocher shared a brief history of AI, contextualizing the rapid advancement in the large language models (LLMs) of today. He also cautioned against creating AI policies in a vacuum, instead recommending that AI be worked into existing policies.

All week, presenters and attendees alike agreed that a shift in perspective on generative AI is needed. Rather than looking down on AI as a way to circumvent original thought and work, they are thinking of it as a tool to spur creativity and enable iteration and feedback for learners in all disciplines.

Baseline AI literacy is necessary for educators

[Photo: Four people sit at a desk with laptops. One person is talking.]

The question of AI literacy was also prevalent: how much expertise does an educator need on generative AI? Members agreed that a baseline understanding of AI tools was necessary for both educators and students, drawing comparisons to calculators and word processors. These were similarly disruptive educational tools that all educators ultimately needed working knowledge of, and attendees agreed that generative AI will be no different.

Professor Eric Klopfer summarized this viewpoint in his presentation outlining the mission of MIT’s Responsible AI for Social Empowerment and Education initiative (MIT RAISE). Klopfer defined AI literacy as “a set of competencies that allow students to be informed users and responsible producers of AI,” stressing that understanding the fundamentals of AI is important for making informed decisions and participating in our civic, work, and social lives.

David Dixon, Generative AI in Global Education Lead at J-WEL, tested attendees’ AI literacy with a hands-on workshop focused on writing better prompts and building custom GPTs. This allowed members to see real-life examples of how working knowledge of ChatGPT, Copilot, and other LLMs can be a tool in an educator’s toolbelt, and to test out creating their own custom GPTs.

In-person gatherings foster unique connection to the MIT community, and each other

[Photo: People sitting around a conference table working, as a person adds a sticky note to a whiteboard at the end of the room. One person is standing, speaking.]

J-WEL’s member institutions regularly report an intangible value in gathering in person for J-WEL Week. The open dialogue, workshops, and impromptu conversations during coffee breaks allow members to find common challenges and opportunities with one another, while the talks from faculty, students, and staff uniquely connect attendees to MIT’s culture of excellence in education. The agenda was designed with these insights in mind.

During the final two days of J-WEL Week, members participated in a design thinking workshop created by J-WEL Faculty Director Anjali Sastry and J-WEL Research Scientist Joe Doiron. Attendees co-created toolkits and recommendations on AI governance, literacy, prompting, and using AI tools with students, with the goal of taking home a set of shared resources to introduce to their home institutions and apply outside of J-WEL Week.

Paula Elksne, director of the Education Innovation Lab at Riga Business School, summed up in Thursday’s closing remarks that J-WEL Week is a “good mix of hands-on work and hearing from others [about AI].” She added, “I’m now thinking more of how we can keep each other accountable to what we’ve learned and support each other as we go to implement [the learnings].”

For additional resources, explore MIT’s collection of Exploration of Generative AI papers and the Sloan School of Management’s Generative AI for Teaching and Learning resources.

Part of MIT Open Learning, the Jameel World Education Lab enables research and outreach with faculty from across MIT, 17 member institutions, and educational innovators worldwide to transform learning at scale.

This article was originally published in MIT Open Learning on Medium.

Duke Learning Innovation and Lifetime Education

Using Trauma-Informed Teaching to Handle Sensitive Topics in Online Teaching: A Case Study from Divinity

How can Christian faith influence the practice of mental health care? How do we define mental health and mental illness — and what are the limitations of these concepts? How can Christian theology provide us with tools to ethically engage with challenges related to mental health care? When developing their new graduate-level course, Christian Approaches to Mental Health Care, Professors Warren Kinghorn and John Swinton created a course where students could explore these questions.

“Theology provides us with a language, a worldview and a set of practices that are vital for mental health care,” Swinton said. “The intention of this course is to help people to think theologically and provide people with other ideas and tools to help them care more fully and more faithfully.”

A hybrid, nine-week course, Christian Approaches to Mental Health Care, is part of a new mental health track in the Divinity School’s Certificate in Theology and Health Care. Students attended an immersive week on Duke’s campus where they were introduced to key concepts; they attended course sessions the following eight weeks via Zoom.

“In the Theology, Medicine, and Culture Initiative, we invite students into deep engagement with the practices of health care in light of Christian theology and practice,” Kinghorn said. “This class is a natural extension of that aim, but now focused specifically on the practices of mental health care.” 

This Divinity course was aimed at both professionals who work in any context related to mental health care (e.g., counselors, social workers) and those who may have a personal interest in the material. First taught in Spring 2024 to over 30 students, the course covered a wide range of topics that needed to be handled with care, including trauma, anxiety, and dementia.

Challenges and Affordances of Teaching Online

How does one approach teaching sensitive topics online? Are there challenges unique to this environment? 

Swinton noted that the structure of a course using web conferencing software like Zoom can present challenges.

“When you are in a standard class situation and difficult issues come up, people have the opportunity to find immediate support either from us as tutors or from classmates,” he said. “When you are doing a course online, that support is often not available. You switch off the camera and you are on your own.”

That said, knowing that the lack of buffer space at the beginning or end of a class session can be an issue, Swinton and Kinghorn have worked to ensure their online sessions do offer support. Kinghorn noted that during his first foray into online teaching during the onset of the COVID-19 pandemic, he was concerned about engaging with sensitive topics in a virtual class setting.

“But I learned that if the class is structured in a way that leaves time and space for self-care, Zoom-based instruction has advantages,” he said. “Students generally join class from a space where they feel safe. They can titrate their own exposure in class, shifting position or even briefly turning off camera or microphone if necessary. If done well and in a trauma-informed way, I think that even the hardest topics can be addressed well through online pedagogy.”

Strategies to Teach a Trauma-Informed Course

What does it mean to be a trauma-informed instructor? How can Kinghorn and Swinton’s course act as an example for other courses?

“Adopting a trauma-informed approach is not accomplished through any single particular technique or checklist,” the CDC’s 6 Guiding Principles to a Trauma-Informed Approach reminds us. “It requires constant attention, caring awareness, sensitivity, and possibly a cultural change at an organizational level.”

Kinghorn and Swinton, however, do have strategies that worked for the particular context of their course. Given that this course covered topics such as depression, suicide, substance use issues, and more during the online weeks, the instructors informed students about upcoming content and encouraged them to approach these topics with care for themselves and for others. While there are different approaches to creating a trauma-informed environment, Kinghorn and Swinton focused on harm reduction in suggesting how instructors can think about this during a live course.

“My belief (as others have argued in the trauma-informed teaching literature) is that it should be a last resort for professors to invite students to opt out of a class due to difficult past experiences, as this kind of approach (if made systematic) can compound educational differences and marginalization— effectively giving educational access to those fortunate not to have particular trauma histories and excluding (even if student-driven) those who do,” Kinghorn said.  “I would much rather structure courses (as I think we did this semester) with the goal that every student, including those with recent loss or difficult trauma histories, feels a sense of welcome and belonging that enables them to participate in the good, hard work of the course.”

Swinton shared that choice is important to him in thinking about student participation.

“For example, if an online student has had someone close to them die by suicide either recently or historically and they feel that a class on this subject could be problematic for them, we would want to make sure that the person involved does not feel compelled to participate in the class if it is going to be harmful,” Swinton said of a hypothetical scenario. “It’s always going to be difficult with sensitive issues online, but we do the best we can to minimize the possibility of harm.”

Other strategies to help students feel that the course was a safe place for learning included:

  • Focusing on helping students create a connected community, which included putting them in consistent breakout room groups of 5-6 students for the semester
  • Modelling honesty and humility as instructors when receiving difficult questions during the immersive week
  • Starting each class session with grounding practices (e.g., reading a psalm and pausing in silent reflection)
  • Staying after class in Zoom to be available to students and ensuring presence and privacy by turning off recording features
  • Providing clear guidelines about evaluations and deadlines, as well as sharing how students should communicate with faculty and teaching assistants
  • Granting structured flexibility (e.g., offering a no-questions-asked 72-hour extension for one assignment over the semester)
  • Respecting students’ privacy (e.g., not asking students to disclose why they need to turn in late work)

Teaching with Care

How might you begin to integrate trauma-informed practices into your own teaching? 

The first step, Swinton said, is to recognize that this is an issue instructors should take note of in all of their teaching.

Using Duke as an example, Kinghorn elaborated that while instructors should not be asking students to disclose their trauma, they can assume “that a majority or near-majority of every class of Duke students have survived trauma of some sort (recognizing the challenges of defining that term).”

“Assume that all students, including trauma survivors, are at Duke because they want to learn, want to engage difficult material well, and generally want to do good, hard work together,” he said. “Then ask: how specific to my subject matter can I invite students to do good, hard work in a way that respects their lived experience? That’s going to differ by class and by student but is the place to start.”

If you’d like to learn more about trauma-informed teaching, here are a few resources where you could start:

  • Trauma-Informed Teaching – University of Wisconsin, La Crosse
  • Trauma-Informed Pedagogy, Montclair State University
  • Karen Costa’s Trauma Informed Pedagogy Course
  • A feature of a recent Coursera course that integrated trauma-informed frameworks into its design
  • SAMHSA, Trauma and Violence
  • The Missouri Model: A Developmental Framework for Trauma-Informed Approaches
  • CDC, 6 Guiding Principles to a Trauma-Informed Approach
  • Potentially Perilous Pedagogies: Teaching Trauma Is Not the Same as Trauma-Informed Teaching

If you’d like to learn more about Warren Kinghorn and John Swinton’s work, here are a few places to start:

  • Finding Jesus in the Storm: The Spiritual Lives of Christians with Mental Health Challenges (2020)
  • Wayfaring: A Christian Approach to Mental Health Care by Warren Kinghorn (July 2024)
  • Developing Best Practices for Trauma-Informed Teaching and Learning
