
Learning Environments Research

An International Journal

This is a transformative journal; you may have access to funding.

  • Jill Aldridge,
  • Barry Fraser


Latest issue

Volume 26, Issue 3

Latest articles

Students’ descriptions of belonging experiences in post-secondary settings.

  • A. Dana Ménard
  • Arianna Pitre
  • Laura Chittle

Exploring the relationship between the learning environment and bullying: PLS-SEM evidence from Norwegian higher education

  • Emmanuel Mensah Kormla Tay
  • Stephen Zamore


How to design ‘cultivated spaces’ in active learning classrooms: analysis of faculty reflections on learning space

  • Merve Basdogan
  • Tracey Birdwell


Assessing students’ perceptions of school climate in primary schools

  • J. M. Aldridge
  • M. J. Blackstock

Designing a caring classroom community: a propensity score matching study

  • Laurie O. Campbell
  • Caitlin Frawley
  • Xueying Gao


Journal information

  • Emerging Sources Citation Index
  • Google Scholar
  • OCLC WorldCat Discovery Service
  • TD Net Discovery Service
  • UGC-CARE List (India)


Original research article. Creating a supportive classroom environment through effective feedback: Effects on students’ school identification and behavioral engagement


  • 1 Centro de Investigação em Educação, ISPA – Instituto Universitário, Lisboa, Portugal
  • 2 UIDEF, Instituto de Educação da Universidade de Lisboa, Lisboa, Portugal

Previous research revealed the connection between students’ behavioral and emotional engagement and a supportive classroom environment. One of the primary tools teachers have to create a supportive classroom environment is effective feedback. In this study, we assessed the supportive classroom environment using the perception of teachers’ use of effective feedback shared by all students in the same classroom. We aimed to explore the effect of such an environment on students’ behavioral engagement and school identification. Using a probabilistic sample of 1,188 students from 75 classrooms across 6th, 7th, 9th, and 10th grades, we employed multilevel regression modeling with random intercept and fixed slopes. We explored the effects of both individual perceptions of teachers’ use of effective feedback and the supportive classroom environment on student engagement. The analyses showed that students who perceived that their teachers used more effective feedback had higher levels of behavioral engagement and school identification. Once we controlled for the effects of these individual perceptions of teachers’ effective feedback, we still observed an effect of the supportive classroom environment on student engagement. Thus, in classrooms where teachers used more effective feedback, creating a supportive classroom environment, students had higher school identification and behavioral engagement levels, regardless of their individual perceptions of teachers’ feedback. The association between variables remained significant even after controlling for students’ characteristics (gender, nationality, mother’s level of education, history of grade retention) and classroom characteristics (grade level, type of school, number of students at the grade level). Our findings support the potential of teachers’ feedback practices to foster students’ school identification and behavioral engagement, to build a more inclusive school environment and to value students’ diversity.

Introduction

Students’ behavioral engagement and school identification are considered critical catalysts for their learning and performance ( Korpershoek et al., 2019 ). Students who value school and feel that they belong there are more likely to engage behaviorally in school activities, experience more in-depth learning, and improve their academic achievement ( Voelkl, 2012 ). These feelings can contribute to reducing school dropout and social exclusion. According to Voelkl (2012) , the development of a sense of identification is mediated by contextual factors–namely, the perception of teacher support–and these factors can be modified to improve school outcomes. A caring, supportive teacher can thus impact students’ identification with school ( Voelkl, 2012 ). If students feel that they are cared for and are allowed to participate actively in classroom activities, they perceive the school climate as positive and supportive, which promotes a sense of belonging and the valuing of school ( Adomnik, 2012 ). Therefore, understanding what teachers can do to support and foster students’ engagement is vital. In the present study, we investigated one factor identified as having critical effects on students’ achievement and engagement: teachers’ feedback ( Wisniewski et al., 2020 ). Feedback on learning tasks and activities is a relevant aspect of the teacher-student relationship that can create a positive and supportive classroom environment ( Black and Wiliam, 1998 ; Black et al., 2004 ; Voelkl, 2012 ). Feedback may have consequences for students’ school experience, subsequently improving or impairing their school identification and behavioral engagement and, in turn, affecting their academic achievement ( Reeve, 2012 ; Reschly and Christenson, 2012 ; Voelkl, 2012 ; Burns et al., 2019 ; Wang and Zhang, 2020 ). Previous research has demonstrated that students’ perception of teachers’ use of feedback plays a significant role in student engagement ( Koka and Hein, 2005 ; Koka and Hein, 2006 ; Price et al., 2011 ; Leh et al., 2014 ; Conboy et al., 2015 ; Burns et al., 2019 ; Kyaruzi et al., 2019 ; Wang and Zhang, 2020 ). Most of this research has investigated perceived teacher feedback at the individual level (e.g., Koka and Hein, 2005 ; Koka and Hein, 2006 ; Leh et al., 2014 ; Conboy et al., 2015 ; Vattøy and Smith, 2019 ; Wang and Zhang, 2020 ), suggesting that effective teacher feedback can promote learning, increase achievement and foster student motivation and engagement.

Thus, as mentioned before, students’ perception of teacher feedback has individual effects on students’ engagement and on their school identification ( Pianta et al., 2012 ; Voelkl, 2012 ). However, the teaching and learning process is not only a relationship between the teacher and students, but also among students themselves. In this interrelation, teachers’ behaviors are fundamental in promoting positive interactions in the classroom ( Conroy et al., 2009 ). As teachers and students share several learning environments and experiences, they build perceptions about the teaching-learning process that allow them to interpret the interactive dynamics in the classroom in a very consistent way. In these interactions, teachers can model constructive feedback, help develop the group’s competence to give effective feedback and create a positive classroom climate, increasing students’ engagement.

Consequently, it is relevant to understand how the context created by teachers’ feedback is likely to impact students’ behavioral engagement and their school identification. Based on previous studies (e.g., Burns et al., 2019 ; Kyaruzi et al., 2019 ), we suggest that the use of effective feedback (assessed through the perceptions of their teachers’ feedback shared by students of the same classroom) creates a supportive classroom environment that will positively influence students’ school identification.

The majority of research regarding students’ perceived feedback and their engagement has focused on student-level characteristics, with less consideration for the contexts in which students are taught ( Burns et al., 2019 ). Therefore, in the present study, we used a multilevel design to investigate how these factors function at both the student and classroom levels. We studied the link between perceived teachers’ use of effective feedback and students’ levels of school identification and behavioral engagement at the individual and classroom levels. The central question is whether the supportive classroom environment created by teachers’ use of effective feedback affects students’ behavior after controlling for their individual perceptions and for other differences at the individual and classroom levels.

Teachers’ Feedback

One of the primary tools teachers have to create this supportive class environment is feedback ( Price et al., 2011 ; Reeve, 2012 ; Reschly and Christenson, 2012 ). Feedback is conceptualized as information students receive about their performance or understanding ( Hattie and Timperley, 2007 ) that reduces the discrepancy between what the student knows and what is intended to be known. Students must also make sense of that information and use it to enhance their learning (Carless and Boud, 2018).

Much has been studied about the effectiveness of feedback, but there is much more to learn about how to optimize its power in the classroom. As Janosz (2012) indicated, the feedback information that students receive and interpret from their schooling experience plays a crucial role in improving their motivation and engagement and is a decisive factor in academic achievement ( Hattie and Timperley, 2007 ). Nevertheless, we also know that the variability of feedback effectiveness is vast and that certain types of feedback are more effective than others ( Hattie and Yates, 2014 ). Thus, different types of feedback allow the student to close the gap between current knowledge and a more desirable level of achievement with different levels of effectiveness. Hattie and Timperley (2007) specified the forms effective feedback should take: the authors use three feedback questions–where am I going (feeding up), how am I going (feeding back) and where to next (feeding forward)–to clarify the goals and criteria for students. For feedback to be effective, these questions must be answered by the student, and feedback needs to work at different levels of cognitive complexity: the task and product level, i.e., corrective feedback; the process level, i.e., providing task-processing strategies and cues for information search so students can develop their own learning strategies; and the self-regulation level, i.e., providing students with information that allows them to improve their competence to monitor their own learning and progress. According to Wisniewski et al. (2020) , feedback is more effective the more information it contains, so high-information feedback contains information on task, process and (sometimes) self-regulation.

Hattie and Timperley (2007) considered that feedback needs to focus on the appropriate question and level of cognitive complexity. Otherwise, it risks being ignored, misunderstood or never used by the student. Generally, it has been shown that feedback at the process and self-regulation levels seems to be more effective in enhancing deeper learning, improving task confidence and self-efficacy, and leading to more internal attributions about success or failure ( Hattie and Yates, 2014 ). Furthermore, the meta-analysis of Wisniewski et al. (2020) suggests that feedback is more effective the more information it contains, while simple forms of reinforcement and punishment have low effects.

The literature also suggests that feedback is related to a positive student-teacher relationship, which is an essential aspect of a positive classroom environment (e.g., Burnett, 2002 ; Gutierrez and Buckley, 2019 ). Burnett (2002) observed that students who perceived receiving feedback focused on their effort were more likely to report a positive teacher-student relationship. The author also reported that students who perceived receiving frequent ability feedback from their teachers were also more likely to perceive the classroom environment in a positive way. On the contrary, teacher praise was not related to students’ perception of the classroom environment or their relationships with their teachers.

Therefore, teachers’ feedback is crucial in improving this supportive class environment by establishing good relationships with students and offering both personal and academic support ( Allen et al., 2018 ). Studies have also determined that a supportive class environment could improve students’ school identification and behavioral engagement ( Voelkl, 2012 ; Allen et al., 2018 ; Olivier et al., 2020 ). Students need to be supported and cared for by teachers to develop and maintain a sense of identification with the school that reinforces their behavioral engagement with the school’s activities ( Voelkl, 2012 ). So, Burnett (2002) recommends that teachers should be careful when providing feedback to students as their relationships with students can influence how students perceive the classroom environment.

In sum, feedback is more effective if it helps students understand what mistakes they made, why they made them, and what they can do to avoid them in the future ( Wisniewski et al., 2020 ). Therefore, effective feedback sets clear standards and expectations that promote a supportive classroom environment, encouraging students’ autonomy, school identification and engagement ( Pianta et al., 2012 ; Voelkl, 2012 ).

Behavioral Engagement and School Identification

The role of student engagement has been considered to be relevant in the literature since authors identified that it improves achievement and persistence in secondary school ( Finn and Zimmer, 2012 ; Korpershoek et al., 2019 ). Engagement is a complex multidimensional construct defined as

the energy and effort that students employ within their learning community, observable via any number of behavioral, cognitive or affective indicators across a continuum. It is shaped by a range of structural and internal influences, including the complex interplay of relationships, learning activities and the learning environment ( Bond et al., 2020 , p. 3).

School identification, well supported by research, has similarly become an important educational goal (e.g., Christenson et al., 2008 ; Reschly and Christenson, 2012 ; Voelkl, 2012 ). School identification can be defined as students’ attitudes about their school, and it is an affective form of student engagement comprising two needs: Belongingness and Valuing. Belongingness refers to “feelings that one is a significant member of the school community, is accepted and respected in school, has a sense of inclusion in school, and includes school as part of one’s self-definition” ( Voelkl, 1996 , p. 762). On the other hand, Valuing has been defined as students “feeling that school and school outcomes have personal importance and/or practical importance” ( Voelkl, 2012 , p. 198).

School identification, also referred to in the literature as affective engagement ( Christenson et al., 2008 ; Reschly and Christenson, 2012 ), is strongly related to behavioral engagement ( Voelkl, 2012 ; Korpershoek et al., 2019 ); the latter is associated with students' active participation and involvement in school and classroom activities, their effort, attendance, active classroom participation, paying attention and homework completion ( Appleton et al., 2006 ; Fredricks et al., 2011 ). Students who identify with school tend to engage in classroom activities more than others. Research shows that students’ behavioral engagement mediates the relation between school identification and students’ academic trajectories ( Reschly and Christenson, 2006 ; Voelkl, 2012 ). Students who develop a sense of identification with the school are more involved in classroom work, actively participating in their learning and autonomously developing new activities, improving their academic achievement ( Korpershoek et al., 2019 ). As indicated by Voelkl (2012) , “classroom participation is the most proximal outcome of identification” (p. 208). Contrarily, students who do not have a sense of belonging or value their school are more likely to disengage or withdraw, and soon drop out ( Voelkl, 2012 ; Lovelace et al., 2014 ; Lovelace et al., 2017 ).

Teachers’ Feedback, School Identification and Engagement

Although recent meta-analyses had found that feedback that contains information on task, process and self-regulation levels is more effective for cognitive outcomes, like students’ achievement ( Wisniewski et al., 2020 ), research also supports that it enhances academic engagement and motivational outcomes ( Gettinger and Ball, 2007 ; Valente et al., 2015 ; Wisniewski et al., 2020 ). In addition, according to Wang and Zhang (2020) , learning engagement had a mediating effect on the relationship between teachers’ feedback and students’ academic performance. The association between teachers’ feedback and students’ engagement seems to exist regardless of the students liking or disliking the learning subject ( Valente et al., 2015 ), although the utility of the feedback depends on how students perceive it ( Handley et al., 2011 ; Kyaruzi et al., 2019 ; Wang and Zhang, 2020 ). Feedback that “draws attention away from the task and toward self-esteem can have a negative effect on attitudes and performance” ( Black and Wiliam, 1998 , p. 13). Hattie (2009) indicates that feedback directed to the self or at the self-level, even if it is positive, like praise, often directs attention away from the task, diluting the power of feedback. Negative and uninformative feedback has the most evident negative influences, because it reduces the experience of autonomy and self-efficacy and because students need to feel that they belong in learning and that there is a trusting relationship between them, their teachers and their peers ( Hattie, 2009 ; Wisniewski et al., 2020 ). For example, Strambler and Weinstein (2010) observed that students who perceive teachers’ feedback as negative or unsupportive respond by devaluing the importance of school, which was negatively related to students’ academic achievement.

The types of interactions teachers have with their students can promote or inhibit student engagement in the classroom. If teachers offer challenging and fun learning activities, encourage students’ participation and provide feedback about how to reach their goals, they are promoting students’ engagement ( Pianta et al., 2012 ). Authors like Voelkl (2012) believe that school identification has its roots in earlier school grades and becomes stronger over time due to the interactions and school experiences. Consequently, if students feel accepted by their peers and supported by teachers, it is expected that they develop an identification with school. According to this author, the development of identification is mediated by contextual factors, namely perceptions of teacher support. Supportive interactions with teachers contribute to positive self-perceptions such as identification with the school, promoting student engagement with academic activities.

High-quality or effective feedback provides students with rich information about the quality of the student answer but principally about the ways to get the right answer and be sure that students use that information to promote learning. This process implies frequent exchanges of information between the student and the teacher. Teachers’ feedback to students’ responses are critical in their engagement in the learning activities ( Pianta et al., 2012 ). Therefore, supportive class environments are essential to develop and maintain students engagement. The use of high-quality feedback by the teacher over time contributes to progressively increase the sense of belongingness and the value the students attribute to school. This development of school identification can facilitate and promote students’ engagement ( Voelkl, 2012 ).

In sum, previous research suggests that students’ perceptions of teachers’ feedback play an important role in creating a supportive classroom environment ( Price et al., 2011 ; Reeve, 2012 ; Reschly and Christenson, 2012 ). Furthermore, supportive classroom environments have been found to significantly impact students’ engagement ( Voelkl, 2012 ; Allen et al., 2018 ). Therefore, we suggest that students’ shared perceptions of teachers’ use of feedback will positively influence students’ engagement and school identification.

Present Study–The Contextual Effect of Teachers’ Feedback

Previous research had explored the link between students’ individual perceptions of teachers’ feedback, students’ behavioral engagement and school identification at the individual level (e.g., Conboy et al., 2015 ; Carvalho et al., 2020 ). Results confirmed that students’ perceptions about teachers’ use of effective feedback were associated with increased behavioral engagement via increased school identification. In the present study we started by confirming that students’ individual perception of teachers’ use of effective feedback was positively related to their school identification and behavioral engagement.

The second purpose of the present study was to expand previous research by analyzing the effects of teachers’ use of effective feedback as an indicator of a supportive classroom environment that influences students’ school identification and behavioral engagement. We considered that a classroom where students shared the perception that their teachers frequently use effective feedback was a classroom with a supportive environment. We hypothesized that in a supportive classroom environment students would have greater levels of school identification and behavioral engagement, even after controlling for the effect of their individual perceptions of teachers’ feedback (if confirmed in our first hypothesis) and after controlling for other differences at the individual and classroom levels. This means that if two students perceived that their teacher used little effective feedback, the student in a classroom with a highly supportive environment would still present higher levels of behavioral engagement and school identification than the student in a classroom with a less supportive environment.

Previous studies have reported that when teachers’ behavior or characteristics are assessed via students’ reports, they should be studied as classroom- or school-level constructs from a multilevel perspective (e.g., Marsh et al., 2012 ). As a result, we implemented multilevel analyses to examine the climate effect of a supportive classroom environment created by the use of effective feedback.

Climate studies evaluate whether school, classroom, or teacher characteristics contribute to predicting students’ outcomes beyond what can be explained by students’ individual characteristics ( Marsh et al., 2012 ). A climate analysis model includes the same variable at both the individual and group levels. Such analyses represent an effort to explain dependent variables (in this case, students’ school identification and behavioral engagement) using a combination of individual and group level independent variables (in this case, students’ perceptions about teachers’ use of effective feedback) ( Blalock, 1984 ). These models allow researchers to investigate the climate effects that teachers’ feedback is presumed to have on the individual students over and above the effect of any individual-level variable that may be operating ( Blalock, 1984 ).
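As a sketch of how such a climate analysis can be written down (the notation below is generic, not the exact Mplus specification reported later), a random-intercept model with a grand-mean-centered student-level predictor and its classroom aggregate takes the form:

```latex
% Level 1: student i in classroom j, predictor grand-mean centered
Y_{ij} = \beta_{0j} + \beta_{1}\,(X_{ij} - \bar{X}_{\cdot\cdot}) + r_{ij}

% Level 2: classroom j, with the classroom aggregate of the same predictor
\beta_{0j} = \gamma_{00} + \gamma_{01}\,\bar{X}_{\cdot j} + u_{0j}
```

With grand-mean centering at level 1, the classroom-level coefficient γ01 can be read as the climate effect of the shared perception over and above the individual-level effect β1.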

Participants and Procedures

Data collected for this study were part of a broader research project ( Carvalho and Conboy, 2015 ), the main aim of which was to understand the dynamics of teacher feedback in developing students’ identity and the consequences of this dynamic on students’ school trajectories. This project’s target population consisted of middle school and early secondary education students from Portuguese public schools. In Portugal, basic education is divided into three cycles: first (1st to 4th grades), second (5th to 6th grades), and third (7th to 9th grades). The project focused on students attending the transitional years between study cycles (6th, 7th, 9th, and 10th grades). At these grade levels, students have several teachers, each teaching a different subject (Eurydice, 2019).

The sample was selected through a probabilistic, multi-stage sampling procedure in continental Portugal, based on the number of students enrolled in the chosen grades in each Territorial Unit for Statistical Purposes (NUTS II, with five regions). Schools were randomly selected for each grade level. Only one or two classrooms of the same grade were sampled in each school.

The final sample consisted of 1,188 students spread over 75 classrooms in 48 schools in continental Portugal. The average number of students per classroom was 16. The sample presented similar patterns of population distribution for grade level and NUTS II region, which indicated that the sample was representative of the Portuguese population. Overall sample characteristics are presented in Table 1 .


TABLE 1. Sample characteristics.

The students responded to a paper-and-pencil questionnaire that included a first section intended to measure students’ school identification, a second section focused on behavioral engagement and a third section that assessed student perception of teacher feedback. The instrument also included a demographic section: gender (0 = girls; 1 = boys), age, nationality (0 = Portuguese; 1 = other nationalities), year of schooling (6th, 7th, 9th or 10th grade), and mother/stepmother’s and father/stepfather’s level of education (1 = 1st cycle of basic education, 2 = 2nd cycle, 3 = 3rd cycle, 4 = secondary education, 5 = higher education).

Students’ Perceptions of Teachers’ Use of Effective Feedback

To measure students’ perceptions of their teachers’ feedback practices, we used eight items from the Teachers’ Feedback Scale, developed by Carvalho et al. (2015) . Students reported their perceptions about teachers’ use of effective feedback in a subject they like. The instruction stated, “Think of a subject that you like”. The reason for including this instruction was to avoid negative experiences associated with a discipline that could interfere with their perceptions of the feedback. The questionnaire included items questioning the feedback at the process level (e.g., “Teachers clearly describe what is not correct and make suggestions for improvement”) or self-regulation level (e.g., “The teachers ask questions that help us reflect on the quality of our work”). Items were answered on a four-point scale (0 = never and 3 = always).

To confirm that the design of the survey did not bias respondents’ answers, we assessed common method variance (CMV) through the Harman single-factor technique, as described by Eichhorn (2014). The common latent factor explained less than 50% of the variance (47.22%), indicating that common method bias was not present (Eichhorn, 2014). We conducted confirmatory factor analyses (CFA) to verify the measure’s structural validity in our sample, using the weighted least square mean and variance adjusted (WLSMV) estimator. Fit index values were adequate (χ²(18) = 61.30, p < 0.001; comparative fit index (CFI) = 0.992; Tucker-Lewis index (TLI) = 0.987; root mean square error of approximation (RMSEA) = 0.045, 90% CI = [0.033, 0.058], p = 0.716). The measure presented adequate reliability (composite reliability, CR = 0.89) (complete results are presented in the Supplementary Material ).
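A minimal sketch of this kind of single-factor check, assuming the eight feedback items sit in a pandas DataFrame with hypothetical column names; it approximates Harman’s test with the first principal component of the item correlation matrix rather than a full common factor analysis:

```python
import numpy as np
import pandas as pd

def harman_single_factor_share(items: pd.DataFrame) -> float:
    """Proportion of total variance captured by the first unrotated component.

    Values above roughly 0.50 are usually read as a warning sign that common
    method variance may be driving the responses.
    """
    corr = items.corr().to_numpy()
    eigenvalues = np.linalg.eigvalsh(corr)   # sorted in ascending order
    return float(eigenvalues[-1] / eigenvalues.sum())

# Hypothetical usage with items named fb1..fb8:
# share = harman_single_factor_share(df[[f"fb{i}" for i in range(1, 9)]])
# print(f"First factor explains {share:.1%} of the variance")
```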

Students’ perceptions of teachers’ effective feedback were aggregated at the classroom level to create a climate variable that reflects the supportive classroom environment. Climate variables are classroom aggregations of ratings by students in which each student is asked to rate a particular classroom characteristic (in this case, the frequency of effective feedback used by the teacher of the subject they like) that is common to all students ( Marsh et al., 2012 ). Since students like different subjects, the aggregation of the ratings provides an indicator of the frequency of effective feedback received by students during the time they are in school. Students’ ratings of teachers’ use of effective feedback were aggregated at the classroom level using a manifest measurement–latent aggregation approach ( Marsh et al., 2009 ). The manifest-latent approach uses multilevel models to aggregate student-level responses (the manifest observed variable) into an unobserved latent variable as an indicator of the climate construct. This procedure corrects for sampling error in the aggregation of individual-level constructs to form classroom-level climate variables ( Marsh et al., 2009 ). Hence, our supportive classroom environment construct was a latent variable at the classroom level based on shared perceptions among different students with the same teachers. Differences among students within the same classroom (the variable at the student level) do not reflect the classroom environment; they represent each student’s unique perceptions that are not explained by the shared perception of different students ( Marsh et al., 2012 ). If there were no significant agreement among students from the same classroom about teachers’ use of feedback, then it could be argued that the classroom-level variable did not reflect the classroom environment ( Marsh et al., 2012 ). Consequently, we tested the agreement between students in the same classroom using the intraclass correlation (ICC2; Lüdtke et al., 2009 ) to indicate the reliability of our classroom environment latent variable. The measure presented an ICC2 of 0.77, which falls within the acceptable range of 0.70–0.85 recommended by Lüdtke et al. (2009) .
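To illustrate how the agreement indices can be computed, the sketch below estimates ICC(1) and ICC(2) from a one-way ANOVA decomposition in the spirit of Lüdtke et al. (2009); the column names are assumptions, and this is a plain manifest aggregation rather than the latent aggregation used in the paper:

```python
import pandas as pd

def icc1_icc2(scores: pd.Series, classrooms: pd.Series) -> tuple[float, float]:
    """One-way ANOVA estimates of ICC(1) and ICC(2).

    ICC(1): share of the total variance located between classrooms.
    ICC(2): reliability of the classroom-mean rating (Spearman-Brown step-up),
    which is what aggregating student reports into a climate variable relies on.
    """
    df = pd.DataFrame({"y": scores, "g": classrooms}).dropna()
    sizes = df.groupby("g")["y"].size()
    means = df.groupby("g")["y"].mean()
    grand_mean = df["y"].mean()

    ss_between = (sizes * (means - grand_mean) ** 2).sum()
    ss_within = ((df["y"] - df["g"].map(means)) ** 2).sum()
    ms_between = ss_between / (len(sizes) - 1)
    ms_within = ss_within / (len(df) - len(sizes))

    k = sizes.mean()                                   # average classroom size
    icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    icc2 = (k * icc1) / (1 + (k - 1) * icc1)
    return float(icc1), float(icc2)

# Hypothetical usage:
# icc1, icc2 = icc1_icc2(df["feedback"], df["classroom"])
```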

Students’ Behavioral Engagement

A nine-item scale authored by Carvalho et al. (2016) was used to assess students’ behavioral engagement in school. The scale assesses two dimensions: academic work, with six items (e.g., “I study the material given in class”), and class participation, with three items (e.g., “I raise my hand to answer a question”). Students answered each item on a four-point Likert scale (0 = never to 3 = always). Students were asked to think of a subject they liked. We only used the global measure composed of these dimensions.

We also assessed the CMV of this scale through the Harman single-factor technique. There was no evidence that common method bias was present (the common latent factor explained only 39.21% of the variance). To confirm the validity of the two-dimensional hierarchical structure of the measure in our sample, we conducted a CFA using the WLSMV estimator. The results indicated evidence of structural validity (χ²(27) = 60.38, p = 0.002; CFI = 0.992; TLI = 0.990; RMSEA = 0.032, 90% CI = [0.021, 0.043], p = 0.996). Composite reliability was also adequate for the global measure (CR = 0.88) (complete results are presented in the Supplementary Material ).

Students’ behavioral engagement outcome variable was aggregated at the classroom level. Once again, we used the manifest-latent approach and calculated the ICC2 as an indicator of reliability ( Lüdtke et al., 2009 ; Marsh et al., 2012 ). The value of ICC2 was 0.67, just below the 0.70 value recommended by Lüdtke et al. (2009) .

Students’ School Identification

The School Identification Scale, authored by Carvalho et al. (2015) , was used to measure students’ school identification. The scale assesses three dimensions of school identification. Three items assess students’ perceptions of their school’s practical value (e.g., “My future depends on what I do in school”). Three items address their feelings of belonging and well-being in school (e.g., “I am happy in this school”). Finally, four items assess students’ perceptions of their capacity and will (e.g., “My skills make me confident about my future”). Items were answered on a four-point Likert scale (0 = completely disagree to 3 = completely agree). In the present study, we only used the global measure composed of these dimensions to avoid multicollinearity problems.

The Harman single-factor test indicated there was no evidence that common method bias was present in this scale either (the common latent factor explained only 32.25% of the variance). We conducted a CFA to confirm the validity of the three-dimensional hierarchical structure of the measure in our sample using the WLSMV estimator. Fit index values were adequate (χ²(31) = 177.35, p < 0.001; CFI = 0.969; TLI = 0.955; RMSEA = 0.063, 90% CI = [0.054, 0.072], p = 0.008). The global measure presented good reliability (CR = 0.84) (complete results are presented in the Supplementary Material ).

Students’ school identification outcome variable was also aggregated at the classroom level, again using the manifest-latent approach. We calculated the ICC2 ( Lüdtke et al., 2009 ) to assess the reliability of the classroom-average identification latent variable. The value of ICC2 was 0.77, indicating adequate reliability ( Lüdtke et al., 2009 ).

Data Analyses

All models were estimated using Mplus 8.4. Missing data (1.6% of all data) was handled by allowing missingness to be a function of the observed covariates but not the observed outcomes, the default Mplus procedure ( Muthén and Muthén, 2017 ). Factor scores of the measures were saved and used as observed manifest variables to make the models more parsimonious, reducing the number of variables involved ( Wang and Wang, 2020 ).

We employed multilevel regression modeling with random intercept and fixed slopes using the robust maximum likelihood (MLR) estimator. Respondents (level 1) were nested within classrooms (level 2) to account for classroom-level baselines in students’ perceptions. We first ran an intercept-only model to examine the intraclass correlation, which indicates the proportion of the total variance explained by differences between classrooms. Next, we estimated two models to evaluate the supportive classroom environment created by teachers’ use of effective feedback. For all the models tested, the predictor variables, except the dichotomous variables, were grand-mean centered.
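The models reported here were estimated in Mplus 8.4; purely as an illustration of a random-intercept, fixed-slope specification with a grand-mean-centered student-level predictor and its classroom aggregate, a rough analogue in Python with statsmodels might look like the following (file and variable names are hypothetical, and this does not reproduce the latent aggregation or the MLR estimator):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: 'engagement' (outcome), 'feedback' (student perception
# of effective feedback), 'classroom' (cluster identifier).
df = pd.read_csv("students.csv")

# Grand-mean center the student-level predictor and build the classroom mean,
# which stands in for the supportive-classroom-environment variable.
df["feedback_c"] = df["feedback"] - df["feedback"].mean()
df["feedback_cm"] = df.groupby("classroom")["feedback"].transform("mean")
df["feedback_cm"] -= df["feedback_cm"].mean()

# Random intercept per classroom, fixed slopes for both predictors;
# control variables could be added to the formula in the same way.
model = smf.mixedlm("engagement ~ feedback_c + feedback_cm",
                    data=df, groups=df["classroom"])
result = model.fit()
print(result.summary())
```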

In Model 1, we assessed a model already tested in previous publications ( Conboy et al., 2015 ; Carvalho et al., 2020 ) based on Voelkl’s (2012) theory. At the individual level, students’ perceptions of teachers’ feedback contribute to students’ school identification and behavioral engagement. At the classroom level, the supportive classroom environment contributes to students’ school identification and behavioral engagement levels. We also proposed that school identification (both at the individual and classroom levels) contributes to students’ behavioral engagement (see Figure 1 ).


FIGURE 1. Conceptual model. Latent classroom-level constructs are represented as circles, and student-level indicators of these latent variables are represented as squares.

In Model 2, we incorporated the control variables. It was important to consider and neutralize individual and group variables that could explain our outcome variables (students’ engagement and school identification) ( Creswell, 2012 ). This allows a more accurate assessment of the relationship between teachers’ feedback and our outcomes because it reduces error variance ( Creswell, 2012 ). At the individual level, we controlled for gender, mother’s and father’s education level, history of grade retention and nationality. These variables had previously been shown to be related to students’ engagement and school identification ( Allen et al., 2018 ; Bear et al., 2019 ; Cunha et al., 2019 ; Olivier et al., 2020 ). At the classroom level, we controlled for grade level and the number of students at the grade level in the school. Previous studies indicated that the odds of a student having low levels of engagement and school identification increase in classrooms in schools with a large number of students ( Finn and Voelkl, 1993 ; Willms, 2003 ; Weiss et al., 2010 ). Moreover, students in the lower grades tend to perceive that their teachers use more effective feedback ( Carvalho et al., 2020 ) and present higher engagement levels ( Eccles et al., 1993 ; Mahatmya et al., 2012 ). We also controlled for classrooms in schools that were part of the Portuguese TEIP Program for priority intervention educational areas, whose aim is to promote educational inclusion in schools located in socially and economically disadvantaged areas ( European Commission, 2013 ).

Model fit was assessed using the indices and cut-off points suggested by Hu and Bentler (1999) : a non-significant chi-square (χ²) value, or a χ² less than three times the degrees of freedom; CFI and TLI values higher than 0.95; and RMSEA and standardized root mean square residual (SRMR) values lower than 0.08.
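For reference, these cut-off points can be collected in a small helper function (a sketch only; the actual fit evaluation was carried out on the Mplus output):

```python
def meets_hu_bentler_cutoffs(chi2: float, df: int, p: float, cfi: float,
                             tli: float, rmsea: float, srmr: float) -> bool:
    """Check the cut-off points listed above (Hu and Bentler, 1999)."""
    chi2_ok = p >= 0.05 or chi2 / df < 3      # non-significant or chi2/df < 3
    return chi2_ok and cfi > 0.95 and tli > 0.95 and rmsea < 0.08 and srmr < 0.08

# E.g. the final model reported in the Results section (using the larger,
# between-level SRMR) passes all criteria:
# meets_hu_bentler_cutoffs(6.53, 8, 0.588, 1.000, 1.004, 0.001, 0.029)  # True
```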

Preliminary Analyses

The unconditional “null” model showed that the intraclass correlations were between 0.111 and 0.179; in other words, between approximately 11.1 and 17.9% of the total variance in the target variables was associated with classroom characteristics (see Table 2 ). Still, the largest proportion of the variance was associated with individual characteristics. Considering that the average cluster size was 16 students, the design effects were between 2.66 and 3.68. Muthén and Satorra (1995) indicated that design effects higher than 2.00 suggest systematic variation between groups that deviates from simple random sampling. Therefore, we confirmed that multilevel modeling was necessary ( Heck and Thomas, 2015 ). In Table 2 , we also present the correlations between variables at the student and classroom levels.
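The design effects quoted above follow the usual formula based on the average cluster size and the intraclass correlation; as a quick check of the arithmetic:

```latex
\mathrm{deff} = 1 + (\bar{k} - 1)\,\rho, \qquad
1 + 15 \times 0.111 \approx 2.67, \qquad
1 + 15 \times 0.179 \approx 3.69
```

with an average classroom size of 16 students, which matches the reported range of 2.66–3.68 up to rounding of the intraclass correlations and exceeds the 2.00 threshold of Muthén and Satorra (1995).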


TABLE 2. Classroom-level intraclass correlations (ICC) and intercorrelations at the student and classroom levels.

Teachers’ Feedback Effects on School Identification and Behavioral Engagement

The multilevel analysis results indicate that students’ individual perceptions of teachers’ use of effective feedback were positively related to both students’ school identification and behavioral engagement (see Model 1 in Table 3 ). Students who perceived that their teachers used more effective feedback presented higher levels of school identification and behavioral engagement. More importantly, the results indicated that, after controlling for the individual effect, the supportive classroom environment had a significant effect on school identification and behavioral engagement levels. These results indicate that students in classrooms where teachers used more effective feedback, thus creating a supportive classroom environment, had higher levels of school identification and behavioral engagement, regardless of their individual perceptions of teachers’ use of effective feedback.


TABLE 3. Coefficients of the multilevel models tested.

Students’ school identification also predicted students’ behavioral engagement, but only at the individual level. Contrary to expectations, students in classrooms with a higher average level of school identification did not present higher engagement levels. Indeed, individual levels of school identification were more relevant in explaining students’ behavioral engagement.

In model 2, we added the control variables at the individual level (gender, nationality, mother’s education level and history of grade retention) and classroom level (grade-level, number of students in the grade-level, TEIP school). To make the model parsimonious, we removed all non-significant paths that did not affect the fit or the predictive power of the model. The final model results are presented in Table 3 and Figure 2 .


FIGURE 2. Standardized coefficients of multilevel Model 2 (MLR estimator). Latent classroom-level constructs are represented as circles and student-level indicators of these latent variables are represented as squares. Dotted lines represent non-significant relations.

At the individual level, besides students’ perceptions of teachers’ feedback, mother’s education level also contributed to students’ school identification and behavioral engagement. The fathers’ educational level only contributed to students’ school identification but not to their behavioral engagement. Gender explained students’ behavioral engagement and school identification, while grade retention explained only school identification. Male students, non-retained students, students whose mother and father had a higher level of education and students who perceived that their teachers used more effective feedback presented higher school identification levels. Female students, students whose mother had a higher level of education, students with a higher level of school identification and students who perceived that their teachers used more effective feedback had higher behavioral engagement levels. Students’ nationality was not related to any variable under study.

Results also indicated that students’ perception of teachers’ feedback was related to students’ history of grade retention. Retained students perceived that their teachers used less effective feedback than non-retained students. Despite this, the relation was very weak.

At the classroom level, we observed that the effect of the classroom environment on school identification and behavioral engagement levels remained significant, with considerable effect sizes. Students in classrooms with more supportive environments (i.e., where teachers used more effective feedback) had higher school identification and behavioral engagement levels. Additionally, students in classrooms from schools with fewer students also had higher school identification and behavioral engagement levels. The number of students was not related to the supportive classroom environment.

Table 3 shows that students in classes from lower grade levels presented higher levels of school identification and indicated a more supportive environment where teachers used more effective feedback. There was a less supportive environment in classrooms from higher grade levels. Belonging to a TEIP school was not related to the supportive classroom environment, school identification or behavioral engagement levels.

The final model presented very good indicators of model fit: χ²(8) = 6.53, p = 0.588; CFI = 1.000; TLI = 1.004; RMSEA < 0.001; SRMR = 0.009 (within), 0.029 (between). The model explained a substantial share of the variance in students’ behavioral engagement (37.3 and 75.0% at the individual and classroom levels, respectively) and school identification (19.6 and 52.8%, respectively). The variance in teachers’ feedback was explained almost exclusively at the classroom level (1.1 and 49.2% at the individual and classroom levels, respectively).

Discussion

In this study, we aimed to understand whether a supportive learning environment generated by teachers’ use of effective feedback can boost students’ school identification and behavioral engagement. We used teachers’ feedback as an indicator of a supportive classroom environment. Our results confirm previous studies indicating that students’ perceptions of teacher feedback are positively related to their school identification and behavioral engagement (e.g., Koka and Hein, 2005 ; Koka and Hein, 2006 ; Leh et al., 2014 ; Conboy et al., 2015 ; Vattøy and Smith, 2019 ; Carvalho et al., 2020 ; Wang and Zhang, 2020 ). The feedback directly experienced by students enhances their sense of autonomy and self-efficacy by offering information about where they are going, how they are going and how to reach their goals ( Hattie, 2009 ; Wisniewski et al., 2020 ). Therefore, by offering effective feedback, the teacher is communicating to the student (and, by extension, to all students in the classroom) that learning is essential and relevant to students’ personal goals (where they are going), that they can succeed and are valued by the teacher (by caring about how they are going), and informing them about the behaviors they need to exhibit to better meet expectations in the future (where to next). In other words, effective feedback reinforces the value of school for the students, their feelings of belongingness and their behavioral engagement in school activities, helping to prevent dropout and social exclusion.

Our results also indicated that other individual variables, like mother’s and father’s educational level, gender and grade retention, were related to students’ school identification and behavioral engagement, which is consistent with previous studies (e.g., Allen et al., 2018 ; Bear et al., 2019 ; Cunha et al., 2019 ; Olivier et al., 2020 ). We found that mothers’ educational level was positively related to students’ school identification and behavioral engagement levels, while fathers’ educational level was only positively related to students’ school identification. Parents’ educational attitudes and beliefs are considered to be significant influences on their children’s educational attitudes. Mothers with higher education levels are more intellectually involved in school activities, providing intellectual resources and helping with schoolwork, thus creating a positive environment in which students develop their school identification and behavioral engagement ( Bempechat and Shernoff, 2012 ). According to Vieira (2013) , recent research on families and family dynamics in Portugal confirms that it is mainly mothers who help children with schoolwork, take them to school, and talk with them about school and their studies. Therefore, behavioral engagement seems to be more affected by the mothers’ level of education than by the fathers’ level of education.

Previous studies have also indicated that female students score higher in all engagement dimensions, especially in the behavioral engagement (e.g., Lietaert et al., 2015 ). Still, our results were not completely consistent with previous studies. In the present study, girls presented higher levels of behavioral engagement, as expected, while males presented higher levels of school identification. It is possible that our results differ from previous research because of the dimensions that were assessed by the school identification measure. Research indicates that males have higher levels of academic self-efficacy (e.g., Huang, 2013 ). In the present study, the school identification measure included a dimension that assesses students’ perceptions of their capacity and will, which contribute greatly to the school identification latent factor (see Supplementary Figure S2 in the Supplementary Material). Therefore, males’ higher levels of school identification might be related to a higher sense of self-efficacy.

Researchers suggest that girls score higher in behavioral engagement because activities are focused on language and verbal learning, competences stereotypically related to girls ( Lietaert et al., 2015 ). Still, Lietaert et al. (2015) observed in their study that teachers offered less support to male students, which was related to their lower engagement compared to girls. The authors suggest that teachers offer more support to girls because they are less tolerant of negative behaviors from boys. In contrast, they associate more positive behaviors (more compliance, better organization skills, etc.) with girls. Portuguese teachers have also described boys as disconnected and irresponsible and girls as more focused and responsible ( Wall et al., 2017 ). Although we only found a marginally significant effect of gender on students’ perception of teachers’ feedback, our results seem to correspond to Lietaert et al.’s (2015) findings: boys perceived that teachers used less effective feedback than girls. Nevertheless, given that the gender bias in feedback observed in this study was small, it could be argued that these students believe that their teachers do not differentiate much between genders in the use of effective feedback. Additionally, Lietaert et al. (2015) indicated that it is possible that boys’ lower levels of engagement could explain why teachers are less supportive. Consequently, future research may need to consider the reciprocal effect between teacher support through effective feedback and engagement to better understand this classroom dynamic.

Regarding the effect of grade retention, previous studies indicated that retention seems to leave a significant mark on students that leads them to develop a more negative attitude toward school, associating school with negative experiences (e.g., Martin, 2011 ; Santos et al., submitted). A highly interesting finding from our study was that retained students also perceived that their teachers used less effective feedback. Although the effect was small, it could indicate that teachers had lower expectations of retained students (e.g., OECD, 2012 ), using less effective feedback with students with lower achievement. For example, Gentrup et al. (2020) observed that teachers communicate their expectations through different feedback practices, giving less positive feedback and more negative feedback to students for whom they have low expectations. Furthermore, Monteiro et al. (2019) observed that teachers gave less feedback at the process and self-regulation levels to students with low achievement. In Santana’s (2019) study, a number of Portuguese school directors admitted that teachers tend to ignore retained students.

However, the central question of the present study is whether the aggregated classroom characteristic of teachers’ feedback, as a measure of a supportive classroom environment, affects students’ school identification and behavioral engagement after controlling for other inter-individual differences at the individual level. The answer was positive: Our findings demonstrated that the supportive environment created by teachers’ feedback varies across schools and that students in classrooms where, on average, teachers used more effective feedback and created a supportive classroom environment, had higher levels of school identification and behavioral engagement than students in classrooms without this supportive environment. This was true regardless of students’ individual perceptions about teachers’ feedback.

These findings represent a significant contribution to the theoretical discussion about perceived feedback. Even if a student perceived that his/her teachers used little effective feedback, if s/he was in a classroom with a highly supportive environment, the student would have higher levels of school identification and behavioral engagement than if s/he was in a classroom with a less supportive environment. These results suggest that feedback interactions may affect the learning and engagement of other students in the classroom because they are exposed to both their peers’ behavior and performance and teachers’ feedback to their peers, as observed by Conroy et al. (2009) . By contrast, if teachers display differential feedback for some students based on their individual characteristics (gender, nationality or achievement levels), creating an unsupportive classroom environment, students’ trust for or receptivity to the teacher as a source of support and feeling of belonging will be reduced ( Voelkl, 2012 ). Therefore, similarly to what Hattie and Timperley (2007) and Wisniewski et al. (2020) have said, the feedback has huge power. The use (or not) of effective feedback can have an overall impact on the classroom environment and climate, increasing (or decreasing) students’ engagement and school identification. This conclusion deserves to be studied more deeply in future research because students were asked to think about a discipline they liked. In a discipline where students have negative experiences, this impact can have other effects on their behavioral engagement and identification.

More importantly, the quality of the supportive classroom environment was not related to the number of students in the school, suggesting that teachers in schools with both large and small numbers of students were able to create a supportive environment using effective feedback. Nevertheless, our results indicated that students in classrooms from schools with fewer students had higher levels of school identification and behavioral engagement, which is coherent with previous studies ( Finn and Voelkl, 1993 ; Willms, 2003 ; Weiss et al., 2010 ). However, the effect size of the number of students in the school is smaller than the effect of the supportive classroom environment. As a result, our findings indicate that the classroom environment is more critical than the number of students in the school for predicting students’ engagement. As mentioned by Weiss et al. (2010) , reducing the number of students in a class does not guarantee that students will experience the same benefits as students in smaller schools, especially if the feedback environment is not adequate to prompt students’ engagement.

At the classroom level, the results indicated that students in classes from lower grade levels presented higher school identification levels. This was expected, since the literature indicates that affective engagement tends to decrease upon the transition to adolescence ( Eccles et al., 1993 ; Mahatmya et al., 2012 ). Eccles et al. (1993) suggested that the mismatch between the needs of developing adolescents and the opportunities afforded by their social environments could decrease their school identification. Our results are consistent with this theory, since we observed a less supportive classroom environment in classrooms from higher grade levels. Therefore, students in higher grades perceive that their teachers offer less feedback than what they feel they need. It could be that students in lower grade levels perceive a more supportive environment because in the 6th and 7th grades teachers focus more on mastery than on performance ( Guo, 2020 ). Teachers’ primary goal in the middle school years is to help students master certain knowledge and skills to prepare for secondary school. Consequently, they may provide more feedback at the process and self-regulation levels to enhance learning habits and abilities ( Guo, 2020 ). By contrast, in the higher grade levels, teachers may focus more on providing correct answers or solutions because they are more focused on performance to help students prepare for their upcoming examinations ( Guo, 2020 ). Future studies could test this hypothesis by assessing both the supportive classroom environment and teachers’ goals and beliefs about feedback across several grade levels.

Although the present investigation contributes to the theory about feedback and students’ engagement by accounting for both individual and classroom factors that may impact students’ schooling experience, it has some important limitations to consider for future research. For the assessment of the supportive classroom environment, we relied on students’ perceptions of their teachers’ feedback practices. Although classroom-level aggregated measures of students’ perceptions are reliable indicators of a learning environment ( Marsh et al., 2012 ), classroom observations, interviews and teacher reports would have provided a complementary evaluation of teachers’ feedback practices. Additionally, we used a manifest-latent approach to aggregate the classroom-level variable. Although it controlled for sampling error at the classroom level, we did not control for measurement error at the individual level. A doubly latent approach would be necessary to control for both types of error ( Marsh et al., 2009 ). Unfortunately, doubly latent models require a larger sample than the one used in the present study ( Marsh et al., 2009 ). Future studies should replicate this research using a larger sample with a higher number of classrooms. Another limitation of this research was the use of a single research method. To produce in-depth and richer information and better understand the relations between the variables under study, a mixed-methods approach would have offered complementary strengths that together give a more precise view of the role of feedback in school identification and student engagement ( Hughes, 2016 ).

Despite these limitations, in the present study we found evidence indicating that students’ perceptions of teachers’ feedback and the classroom environment created by effective feedback were more critical for explaining students’ school identification and behavioral engagement than their individual characteristics (like mother’s level of education, gender and grade retention) or the size of the school. Therefore, improving teachers’ use of effective feedback, especially in the upper grade levels, could impact students’ engagement levels. It is essential to provide teachers with training focused on giving effective, high-quality feedback. This training should enable teachers to provide a supportive environment to all their students, independent of students’ gender or of the achievement teachers expect from them based on their school trajectories.

Being a competent teacher who develops a supportive classroom environment also means being alert to the expectations constructed around students. If teachers hold different expectations based on achievement levels or gender, this can influence whether students trust teachers as a source of support and as someone who allows them to develop feelings of school belonging ( Voelkl, 2012 ).

The evidence suggests that teachers can improve school identification and behavioral engagement by using effective feedback. When teachers develop a supportive classroom environment through the use of effective feedback, students learn better and achieve their psychological and social goals. A supportive environment motivates students to communicate with their teachers and peers, to engage in different activities and forms of learning, and to increase their sense of school identification.

Data Availability Statement

The data analyzed in this study is subject to the following licenses/restrictions: The datasets generated for this study are available on request. Requests to access these datasets should be directed to Carolina Carvalho, [email protected].

Ethics Statement

The studies involving human participants were reviewed and approved by the Comissão de Ética (CdE) of the Instituto de Educação (IE) da Universidade de Lisboa. Written informed consent to participate in this study was provided by the participants’ legal guardian/next of kin.

Author Contributions

All authors listed have made a substantial, direct, and intellectual contribution to the work and approved it for publication.

Funding

This study was supported by the FCT–Science and Technology Foundation–UIDP/04853/2020.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/feduc.2021.661736/full#supplementary-material

Adomnik, J. G. (2012). The Effects of Self-Determination, Identification with School, and School Climate on Middle School Students’ Aspirations for Future Education. Doctoral Dissertation. Tuscaloosa: Graduate School of the University of Alabama. Available at: https://ir.ua.edu/bitstream/handle/123456789/1402/file_1.pdf?sequence=1&isAllowed=y .

Allen, K., Kern, M. L., Vella-Brodrick, D., Hattie, J., and Waters, L. (2018). What Schools Need to Know about Fostering School Belonging: A Meta-Analysis. Educ. Psychol. Rev. 30, 1–34. doi:10.1007/s10648-016-9389-8

Appleton, J. J., Christenson, S. L., Kim, D., and Reschly, A. L. (2006). Measuring Cognitive and Psychological Engagement: Validation of the Student Engagement Instrument. J. Sch. Psychol. 44, 427–445. doi:10.1016/j.jsp.2006.04.002

Bear, G. G., Harris, A., Saraiva de Macedo Lisboa, C., and Holst, B. (2019). Perceptions of Engagement and School Climate: Differences between Once-Retained and Multiple-Retained Students in Brazil. Int. J. Sch. Educ. Psychol. 7 (1), 18–27. doi:10.1080/21683603.2017.1376725

Bempechat, J., and Shernoff, D. J. (2012). “Parental Influences on Achievement Motivation and Student Engagement,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 315–342. doi:10.1007/978-1-4614-2018-7_15

Black, P., Harrison, C., Lee, C., Marshall, B., and Wiliam, D. (2004). Working inside the Black Box: Assessment for Learning in the Classroom. Phi Delta Kappan 86 (1), 8–21. doi:10.1177/003172170408600105

Black, P., and Wiliam, D. (2010). Inside the Black Box: Raising Standards through Classroom Assessment. Phi Delta Kappan 92 (2), 81–90. doi:10.1177/003172171009200119

Blalock, H. M. (1984). Contextual-Effects Models: Theoretical and Methodological Issues. Annu. Rev. Sociol. 10, 353–372. doi:10.1146/annurev.so.10.080184.002033

Bond, M., Buntins, K., Bedenlier, S., Zawacki-Richter, O., and Kerres, M. (2020). Mapping Research in Student Engagement and Educational Technology in Higher Education: A Systematic Evidence Map. Int. J. Educ. Technol. High Educ. 17, 2. doi:10.1186/s41239-019-0176-8

Burnett, P. C. (2002). Teacher Praise and Feedback and Students' Perceptions of the Classroom Environment. Educ. Psychol. 22 (1), 5–16. doi:10.1080/01443410120101215

Burns, E. C., Martin, A. J., and Collie, R. J. (2019). Examining the Yields of Growth Feedback from Science Teachers and Students' Intrinsic Valuing of Science: Implications for Student‐ and School‐level Science Achievement. J. Res. Sci. Teach. 56 (8), 1060–1082. doi:10.1002/tea.21546

Carvalho, C., and Conboy, J. (2015). Feedback, identidade, trajetórias escolares: Dinâmicas e consequências [Feedback, identity, school trajectories: Dynamics and consequences] . Lisbon: Instituto de Educação da Universidade de Lisboa

Carvalho, C., Conboy, J., Santos, J., Fonseca, J., Tavares, D., Martins, D., et al. (2015). An Integrated Measure of Student Perceptions of Feedback, Engagement and School Identification. Proced. - Soc. Behav. Sci. 174, 2335–2342. doi:10.1016/j.sbspro.2015.01.896

Carvalho, C., Conboy, J., Santos, J., Fonseca, J., Tavares, D., Martins, D., et al. (2017). Escala de Perceção dos Alunos sobre o seu Envolvimento Comportamental Escolar: Construção e Validação. Psic.: Teor. e Pesq. 32 (3), e323219. doi:10.1590/0102-3772e323219

Carvalho, C., Santos, N. N., António, R., and Martins, D. S. M. (2020). Supporting Students' Engagement with Teachers' Feedback: the Role of Students' School Identification. Educ. Psychol. , 1–20. doi:10.1080/01443410.2020.1849564

Christenson, S. L., Reschly, A. L., Appleton, J. J., Berman, S., Spanjers, D., and Varro, P. (2008). “Best Practices in Fostering Student Engagement,” in Best Practices in School Psychology V . Editors A. Thomas, and J. Grimes (Bethesda, MD: National Association of School Psychologists ), 1099–1120.

Conboy, J., Caravalho, C., and Santos, J. (2015). “Feedback, identificação, envolvimento: Construção de um modelo explicativo,” in Feedback, identidade, trajetórias escolares: Dinâmicas e consequências . Editors C. Carvalho, and J. Conboy (Lisbon: Instituto de Educação da Universidade de Lisboa ), 83–108.

Conroy, M. A., Sutherland, K. S., Snyder, A., Al-Hendawi, M., and Vo, A. (2009). Creating a Positive Classroom Atmosphere: Teachers’ Use of Effective Praise and Feedback. Beyond Behav. 18, 18–26. Available at: https://johnston1025.files.wordpress.com/2016/01/creating-a-positive-classroom-environment.pdf .

Creswell, J. (2012). Educational Research. Planning, Conducting and Evaluating Quantitative and Qualitative Research . Boston: Pearson .

Cunha, J., Rosário, P., Núñez, J. C., Vallejo, G., Martins, J., and Högemann, J. (2019). Does Teacher Homework Feedback Matter to 6th Graders' School Engagement? a Mixed Methods Study. Metacognition Learn. 14 (2), 89–129. doi:10.1007/s11409-019-09200-z

Eccles, J. S., Midgley, C., Wigfield, A., Buchanan, C. M., Reuman, D., Flanagan, C., et al. (1993). Development during Adolescence: The Impact of Stage-Environment Fit on Young Adolescents' Experiences in Schools and in Families. Am. Psychol. 48, 90–101. doi:10.1037/0003-066X.48.2.90

European Commission (2013). Reducing Early School Leaving: Key Messages and Policy Support. Final Report of the Thematic Working Group on Early School Leaving. Available at: https://ec.europa.eu/education/sites/education/files/early-school-leaving-group2013-report_en.pdf . (Accessed November 20, 2020).

Finn, J. D., and Voelkl, K. E. (1993). School Characteristics Related to Student Engagement. J. Negro Edu. 62 (3), 249–268. doi:10.2307/2295464

Finn, J. D., and Zimmer, K. S. (2012). “Student Engagement: What Is it? Why Does it Matter?,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 97–131. doi:10.1007/978-1-4614-2018-7_5

Fredricks, J., McCloskey, W., Meli, L., Mordica, J., Montrose, B., and Mooney, K. (2011). Measuring Student Engagement in Upper Elementary School through High School: A Description of 21 Instruments (Issues and Answers Report, REL 2011 No. 098). Available at: https://files.eric.ed.gov/fulltext/ED514996.pdf .

Gentrup, S., Lorenz, G., Kristen, C., and Kogan, I. (2020). Self-fulfilling Prophecies in the Classroom: Teacher Expectations, Teacher Feedback and Student Achievement. Learn. Instruction 66, 101296. doi:10.1016/j.learninstruc.2019.101296

Gettinger, M., and Ball, C. (2007). “Best Practices in Increasing Academic Engaged Time,” in Best Practices in School Psychology V . Editors A. Thomas, and J. Grimes (Bethesda, MD: National Association of School Psychologists ), 1043–1075.

Guo, W. (2020). Grade-Level Differences in Teacher Feedback and Students' Self-Regulated Learning. Front. Psychol. 11, 783. doi:10.3389/fpsyg.2020.00783

Gutierrez, A. S., and Buckley, K. H. (2019). Stories from the Field: Building strong Teacher-Students Relationships in the Classroom. Transform. Educ (1). Available at: https://files.eric.ed.gov/fulltext/ED601206.pdf .

Handley, K., Price, M., and Millar, J. (2011). Beyond 'doing Time': Investigating the Concept of Student Engagement with Feedback. Oxford Rev. Edu. 37 (4), 543–560. doi:10.1080/03054985.2011.604951

Hattie, J., and Timperley, H. (2007). The Power of Feedback. Rev. Educ. Res. 77 (1), 81–112. doi:10.3102/003465430298487

Hattie, J. (2009). Visible Learning: A Synthesis of over 800 Meta-Analyses Relating to Achievement . London: Routledge .

Hattie, J., and Yates, G. (2014). Visible Learning and the Science of How We Learn . New York, NY: Routledge .

Heck, R. H., and Thomas, S. L. (2015). An Introduction to Multilevel Modeling Techniques: MLM and SEM Approaches Using Mplus . London: Routledge .

Hu, L. t., and Bentler, P. M. (1999). Cutoff Criteria for Fit Indexes in Covariance Structure Analysis: Conventional Criteria versus New Alternatives. Struct. Equation Model. A Multidisciplinary J. 6 (1), 1–55. doi:10.1080/10705519909540118

Huang, C. (2013). Gender Differences in Academic Self-Efficacy: a Meta-Analysis. Eur. J. Psychol. Educ. 28, 1–35. doi:10.1007/s10212-011-0097-y

Hughes, A. S. (2016). Mixed Methods Research. Observer 29 (5). Available at: https://www.psychologicalscience.org/observer/mixed-methods-research (Accessed November 20, 2020).

Janosz, M. (2012). “Part IV Commentary: Outcomes of Engagement and Engagement as an Outcome: Some Consensus, Divergences, and Unanswered Questions,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 695–703. doi:10.1007/978-1-4614-2018-7_33

Koka, A., and Hein, V. (2006). Perceptions of Teachers' Positive Feedback and Perceived Threat to Sense of Self in Physical Education: a Longitudinal Study. Eur. Phys. Edu. Rev. 12 (2), 165–179. doi:10.1177/1356336X06065180

Koka, A., and Hein, V. (2005). The Effect of Perceived Teacher Feedback on Intrinsic Motivation in Physical Education. Int. J. Sport Psychol. 36 (2), 91–106.

Korpershoek, H., Canrinus, E. T., Fokkens-Bruinsma, M., and de Boer, H. (2019). The Relationships between School Belonging and Students' Motivational, Social-Emotional, Behavioural, and Academic Outcomes in Secondary Education: a Meta-Analytic Review. Res. Pap. Edu. 35, 641–680. doi:10.1080/02671522.2019.1615116

Kyaruzi, F., Strijbos, J.-W., Ufer, S., and Brown, G. T. L. (2019). Students' Formative Assessment Perceptions, Feedback Use and Mathematics Performance in Secondary Schools in Tanzania. Assess. Educ. Principles, Pol. Pract. 26 (3), 278–302. doi:10.1080/0969594X.2019.1593103

Leh, L. Y., Abdullah, A. G., and Ismail, A. (2014). The Influence of Feedback Environment towards Self-Efficacy for Students Engagement, Classroom Management and Teaching Strategies. Int. J. Manag. Sci. 4 (6), 253–260. doi:10.5296/jse.v4i4.6456

Lietaert, S., Roorda, D., Laevers, F., Verschueren, K., and De Fraine, B. (2015). The Gender gap in Student Engagement: The Role of Teachers' Autonomy Support, Structure, and Involvement. Br. J. Educ. Psychol. 85, 498–518. doi:10.1111/bjep.12095

Lovelace, M. D., Reschly, A. L., and Appleton, J. J. (2017). Beyond School Records: The Value of Cognitive and Affective Engagement in Predicting Dropout and On-Time Graduation. Prof. Sch. Couns. 21, 70–84. doi:10.5330/1096-2409-21.1.70

Lovelace, M. D., Reschly, A. L., Appleton, J. J., and Lutz, M. E. (2014). Concurrent and Predictive Validity of the Student Engagement Instrument. J. Psychoeducational Assess. 32 (6), 509–520. doi:10.1177/0734282914527548

Lüdtke, O., Robitzsch, A., Trautwein, U., and Kunter, M. (2009). Assessing the Impact of Learning Environments: How to Use Student Ratings of Classroom or School Characteristics in Multilevel Modeling. Contemp. Educ. Psychol. 34 (2), 120–131. doi:10.1016/j.cedpsych.2008.12.001

Mahatmya, D., Lohman, B. J., Matjasko, J. L., and Farb, A. F. (2012). “Engagement across Developmental Periods,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 45–63. doi:10.1007/978-1-4614-2018-7_3

Marsh, H. W., Lüdtke, O., Nagengast, B., Trautwein, U., Morin, A. J. S., Abduljabbar, A. S., et al. (2012). Classroom Climate and Contextual Effects: Conceptual and Methodological Issues in the Evaluation of Group-Level Effects. Educ. Psychol. 47 (2), 106–124. doi:10.1080/00461520.2012.670488

Marsh, H. W., Lüdtke, O., Robitzsch, A., Trautwein, U., Asparouhov, T., Muthén, B., et al. (2009). Doubly-latent Models of School Contextual Effects: Integrating Multilevel and Structural Equation Approaches to Control Measurement and Sampling Error. Multivariate Behav. Res. 44 (6), 764–802. doi:10.1080/00273170903333665

Martin, A. J. (2011). Holding Back and Holding behind: Grade Retention and Students' Non-academic and Academic Outcomes. Br. Educ. Res. J. 37, 739–763. doi:10.1080/01411926.2010.490874

Monteiro, V., Mata, L., Santos, N., Sanches, C., and Gomes, M. (2019). Classroom Talk: The Ubiquity of Feedback. Front. Educ. 4, 140. doi:10.3389/feduc.2019.00140

Muthén, B. O., and Satorra, A. (1995). Complex Sample Data in Structural Equation Modeling. Sociological Methodol. 25, 267–316. doi:10.2307/271070

Muthén, L. K., and Muthén, B. O. (2017). Mplus Statistical Analysis with Latent Variables. Mplus User’s Guide . 8th Edition. Los Angeles, CA: Muthén & Muthén .

OECD (2012). Equity and Quality in Education: Supporting Disadvantaged Students and Schools . Paris: Author . doi:10.1787/9789264130852-en

Olivier, E., Galand, B., Hospel, V., and Dellisse, S. (2020). Understanding Behavioural Engagement and Achievement: The Roles of Teaching Practices and Student Sense of Competence and Task Value. Br. J. Educ. Psychol. 90, 887–909. doi:10.1111/bjep.12342

Pianta, R. C., Hamre, B. K., and Allen, J. P. (2012). “Teacher-Student Relationships and Engagement: Conceptualizing, Measuring, and Improving the Capacity of Classroom Interactions,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 365–386. doi:10.1007/978-1-4614-2018-7_17

Price, M., Handley, K., and Millar, J. (2011). Feedback: Focusing Attention on Engagement. Stud. Higher Edu. 36 (8), 879–896. doi:10.1080/03075079.2010.483513

Reeve, J. (2012). “A Self-Determination Theory Perspective on Student Engagement,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 149–172. doi:10.1007/978-1-4614-2018-7_7

Reschly, A. L., and Christenson, S. L. (2012). “Jingle, Jangle, and Conceptual Haziness: Evolution and Future Directions of the Engagement Construct,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 3–19. doi:10.1007/978-1-4614-2018-7_1

Reschly, A. L., and Christenson, S. L. (2006). Prediction of Dropout Among Students with Mild Disabilities. Remedial Spec. Edu. 27 (5), 276–292. doi:10.1177/07419325060270050301

Santana, M. R. R. (2019). Práticas e representações acerca da retenção escolar [Practices and representations about school retention] (Doctoral dissertation) . Lisbon, Portugal : Universidade Nova de Lisboa. Available at: http://hdl.handle.net/10362/89715 .

Santos, N. N., Monteiro, V., and Carvalho, C. (2021). The Impact of Grade Retention and School Engagement on Students’ Intention to Enrol in Higher Education . Lisboa, Portugal: Centro de Investigação em Educação, ISPA - Instituto Universitário; UIDEF, Instituto de Educação da Universidade de Lisboa .

Strambler, M. J., and Weinstein, R. S. (2010). Psychological Disengagement in Elementary School Among Ethnic Minority Students. J. Appl. Develop. Psychol. 31, 155–165. doi:10.1016/j.appdev.2009.11.006

Valente, M. O., Conboy, J., and Carvalho, C. (2015). “Teacher Communication of Evaluation Results: Impact on Students’ Engagement in School,” in Feedback, identidade, trajetórias escolares: Dinâmicas e consequências [Feedback, identity, school trajectories: Dynamics and consequences] . Editors C. Carvalho, and J. Conboy (Lisbon: Instituto de Educação da Universidade de Lisboa ), 13–30.

Vattøy, K.-D., and Smith, K. (2019). Students' Perceptions of Teachers' Feedback Practice in Teaching English as a Foreign Language. Teach. Teach. Edu. 85, 260–268. doi:10.1016/j.tate.2019.06.024

Vieira, M. M. (2013). “Pais desorientados? O apoio à escolha vocacional dos filhos em contextos de incerteza [Disoriented parents? Support for the vocational choice of children in contexts of uncertainty],” in Habitar a escola e as suas margens Geografia Plurais em Confronto . Editors M. M. Vieira, J. Resende, M. A. Nogueira, J. Dayrell, A. Martins, A. Calhaet al. (Portalegre: Instituto Politécnico de Portalegre - Escola Superior de Educação ), 51–64.

Voelkl, K. E. (1996). Measuring Students' Identification with School. Educ. Psychol. Meas. 56, 760–770. doi:10.1177/0013164496056005003

Voelkl, K. E. (2012). “School Identification,” in Handbook of Research on Student Engagement . Editors S. L. Christenson, A. L. Reschly, and C. Wylie (London: Springer ), 193–218. doi:10.1007/978-1-4614-2018-7_9

Wall, K., Cunha, V., Atalaia, S., Rodrigues, L., Correia, R., Correia, S. V., et al. (2017). White Paper. Men and Gender equality in Portugal . Lisbon: Institute of Social Sciences of the University of Lisbon .

Wang, J., and Wang, X. (2020). Structural Equation Modeling. Applications Using Mplus . 2nd Edition. Oxford, United Kingdom: Wiley .

Wang, S., and Zhang, D. (2020). Perceived Teacher Feedback and Academic Performance: the Mediating Effect of Learning Engagement and Moderating Effect of Assessment Characteristics. Assess. Eval. Higher Edu. 45, 973–987. doi:10.1080/02602938.2020.1718599

Weiss, C. C., Carolan, B. V., and Baker-Smith, E. C. (2010). Big School, Small School: (Re)testing Assumptions about High School Size, School Engagement and Mathematics Achievement. J. Youth Adolescence 39, 163–176. doi:10.1007/s10964-009-9402-3

Willms, J. D. (2003). Students Engagement at School. A Sense of Belonging and Participation. Results from PISA 2000 . Paris, France: OECD publishers . doi:10.4324/9780203299456

Wisniewski, B., Zierer, K., and Hattie, J. (2020). The Power of Feedback Revisited: A Meta-Analysis of Educational Feedback Research. Front. Psychol. 10, 3087. doi:10.3389/fpsyg.2019.03087

Keywords: teachers’ feedback practices, school identification, behavioral engagement, supportive classrooms, multilevel analysis, middle school, secondary school

Citation: Monteiro V, Carvalho C and Santos NN (2021) Creating a Supportive Classroom Environment Through Effective Feedback: Effects on Students’ School Identification and Behavioral Engagement. Front. Educ. 6:661736. doi: 10.3389/feduc.2021.661736

Received: 31 January 2021; Accepted: 11 June 2021; Published: 25 June 2021.

Copyright © 2021 Monteiro, Carvalho and Santos. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Natalie Nóbrega Santos, [email protected]

This article is part of the Research Topic

The Role of Teacher Interpersonal Variables in Students’ Academic Engagement, Success, and Motivation

  • Research article
  • Open access
  • Published: 01 October 2021

Adaptive e-learning environment based on learning styles and its impact on development students' engagement

  • Hassan A. El-Sabagh, ORCID: orcid.org/0000-0001-5463-5982

International Journal of Educational Technology in Higher Education, volume 18, Article number: 53 (2021)


Adaptive e-learning is viewed as a stimulus to support learning and improve student engagement, so designing appropriate adaptive e-learning environments contributes to personalizing instruction and reinforcing learning outcomes. The purpose of this paper is to design an adaptive e-learning environment based on students' learning styles and to study the impact of that environment on students’ engagement. The research also outlines and compares the proposed adaptive e-learning environment with a conventional e-learning approach. The paper is based on mixed research methods: a development method was used to design the adaptive e-learning environment, and a quasi-experimental research design was used to conduct the experiment. The student engagement scale measures the following affective and behavioral factors of engagement: skills, participation/interaction, performance, and emotional engagement. The results revealed that the experimental group scored statistically significantly higher than the control group. These experimental results imply the potential of an adaptive e-learning environment to engage students in learning. Several practical recommendations follow from this paper: how to design and implement adaptive e-learning based on learning styles, how to increase the impact of adaptive e-learning in education, and how to raise the cost efficiency of education. The proposed adaptive e-learning approach and the results can help e-learning institutes design and develop more customized, adaptive e-learning environments that reinforce student engagement.

Introduction

In recent years, educational technology has advanced at a rapid rate. Once learning experiences are customized, e-learning content becomes richer and more diverse (El-Sabagh & Hamed, 2020 ; Yang et al., 2013 ). E-learning produces constructive learning outcomes, as it allows students to participate actively in learning at any time and place (Chen et al., 2010 ; Lee et al., 2019 ). Recently, adaptive e-learning has become an approach widely implemented by higher education institutions. The adaptive e-learning environment (ALE) is an emerging research field that deals with development approaches that fulfill students' learning styles by adapting the learning environment within the learning management system (LMS), changing how e-content is delivered. Adaptive e-learning is a learning process in which the content is taught or adapted based on students' learning styles or preferences (Normadhi et al., 2019 ; Oxman & Wong, 2014 ). By offering customized content, adaptive e-learning environments improve the quality of online learning. The customized environment should adapt to the needs and learning styles of each student in the same course (Franzoni & Assar, 2009 ; Kolekar et al., 2017 ). Adaptive e-learning changes the level of instruction dynamically based on student learning styles and personalizes instruction to enhance or accelerate a student's success. Directing instruction to each student's strengths and content needs can minimize course dropout rates and increase student outcomes and the speed at which they are accomplished. The personalized learning approach focuses on providing an effective, customized, and efficient path of learning so that every student can participate in the learning process (Hussein & Al-Chalabi, 2020 ). Learning styles, in turn, represent an important issue in twenty-first-century learning, with students expected to participate actively in developing self-understanding as well as engagement with their environment (Klasnja-Milicevic et al., 2011 ; Nuankaew et al., 2019 ; Truong, 2016 ).

In current conventional e-learning environments, instruction has traditionally followed a “one style fits all” approach, which means that all students are exposed to the same learning procedures. This type of learning does not take into account students' different learning styles and preferences. Currently, the development of e-learning systems has accommodated and supported personalized learning, in which instruction is fitted to a student's individual needs and learning styles (Beldagli & Adiguzel, 2010 ; Benhamdi et al., 2017 ; Pashler et al., 2008 ). Some personalized approaches let students choose content that matches their personality (Hussein & Al-Chalabi, 2020 ). The delivery of course materials is an important issue in personalized learning. Moreover, designing an effective adaptive e-learning system is a challenge because of the complexity of adapting to learners' different needs (Alshammari, 2016 ). It is nevertheless claimed that shifting to adaptive e-learning environments can reinforce students' engagement. However, a learning environment cannot be considered adaptive if it is not flexible enough to accommodate students' learning styles (Ennouamani & Mahani, 2017 ).

On the other hand, student engagement has become a central issue in learning; it is also an indicator of educational quality and of whether active learning occurs in classes (Lee et al., 2019 ; Nkomo et al., 2021 ; Robinson & Hullinger, 2008 ). Veiga et al. ( 2014 ) suggest that there is a need for further research on engagement because students’ engagement is a predictor of learning and academic progress. It is important to clarify the distinction between causal factors, such as the learning environment, and outcome factors, such as achievement. Accordingly, student engagement is an important research topic because it affects a student's final grade and course dropout rate (Staikopoulos et al., 2015 ).

The Umm Al-Qura University strategic plan, through the common first-year deanship, has focused on best practices that increase students' higher-order skills. These skills include communication, problem-solving, research, and creative thinking skills. Although the UQU action plan involves improving these skills through common first-year academic programs, students' learning skills need to be encouraged and engaged further (Umm Al-Qura University Agency, 2020 ). Based on the author's experience, the conventional methods of instruction in the "learning skills" course present the content to all students in one style, regardless of the diversity of their learning styles.

According to some studies (Alshammari & Qtaish, 2019 ; Lee & Kim, 2012 ; Shih et al., 2008 ; Verdú et al., 2008 ; Yalcinalp & Avc, 2019 ), little attention is paid to the needs and preferences of individual learners, and as a result all learners are treated in the same way. More research into the impact of educational technologies on developing skills and performance among different learners is recommended. This “one-style-fits-all” approach implies that all learners are expected to use the same learning style prescribed by the e-learning environment. A review of the literature suggested that an adaptive e-learning environment can affect learning outcomes, addressing the identified gap. In conclusion, adaptive e-learning environments rely on the learner's preferences and learning style as the reference that supports adaptation.

To confirm the above, the author conducted an exploratory study via an open interview with a sample of 50 students in the learning skills department of the common first year. The questions asked about the difficulties students face when learning the "learning skills" course and about their preferred way of receiving course content. Most students (88%) agreed that the way content is presented does not differ according to their individual differences and that they suffer from a lack of personalized learning compatible with their style of working. Students (82%) agreed that they lack adaptive educational content that helps them to be engaged in the learning process. Accordingly, the author formulated the research problem.

This research adds to the existing body of knowledge on the subject. It is significant because it improves understanding of the challenges involved in designing adaptive environments based on the learning styles parameter. The paper is structured as follows: the next section presents the related work cited in the literature, followed by the research methodology, data collection, results, and discussion; finally, conclusions and future trends are discussed.

Theoretical framework

This section provides a review of the literature on adaptive e-learning environments based on learning styles.

Adaptive e-learning environments based on learning styles

The employment of adaptive e-learning in higher education has been slow to evolve, and the challenges that led to the slow implementation still exist. The learning management system offers the same tools to all learners, although individual learners need different material based on their learning styles and preferences (Beldagli & Adiguzel, 2010 ; Kolekar et al., 2017 ). An adaptive e-learning environment requires evaluating the learner's preferred learning style, either before course delivery (e.g., through an online quiz) or during course delivery (e.g., by tracking student reactions) (DeCapua & Marshall, 2015 ).

In e-learning environments, adaptation is built on a series of well-designed processes that fit the instructional materials to the learner. The adaptive e-learning framework attempts to match instructional content to learners' needs and styles. According to Qazdar et al. ( 2015 ), adaptive e-learning (AEL) environments rely on constructing a model of each learner's needs, preferences, and styles. It is well recognized that such adaptive behavior can increase learners' development and performance, thus enriching the quality of the learning experience (Shi et al., 2013 ). The following features of adaptive e-learning environments can be identified: diversity, interactivity, adaptability, feedback, performance, and predictability. Although adaptive framework taxonomies and characteristics relate to various elements, adaptive learning includes at least three: a model of the structure of the content to be learned with detailed learning outcomes (a content model); a representation of the student's expertise based on performance, together with a method of interpreting student strengths (a learner model); and a method of matching the instructional materials and delivering them in a customized way (an instructional model) (Ali et al., 2019 ). The number of adaptive e-learning studies has increased over the last few years, and adaptive e-learning is likely to grow at an accelerating pace at all levels of instruction (Hussein & Al-Chalabi, 2020 ; Oxman & Wong, 2014 ).

Many studies have confirmed the power of adaptive e-learning in delivering e-content to learners in a way that fits their needs and learning styles, which helps improve students' acquisition of knowledge and experience and develops their higher-order thinking skills (Ali et al., 2019 ; Behaz & Djoudi, 2012 ; Chun-Hui et al., 2017 ; Daines et al., 2016 ; Dominic et al., 2015 ; Mahnane et al., 2013 ; Vassileva, 2012 ). Learning style is recognized as an important student characteristic and a vital influence on learning, and it is frequently used as a foundation for generating personalized learning experiences (Alshammari & Qtaish, 2019 ; El-Sabagh & Hamed, 2020 ; Hussein & Al-Chalabi, 2020 ; Klasnja-Milicevic et al., 2011 ; Normadhi et al., 2019 ; Ozyurt & Ozyurt, 2015 ).

Learning style is a key parameter in designing adaptive e-learning environments. Individuals differ in their learning styles when interacting with the content presented to them, and many studies have emphasized the relationship between e-learning and learning styles for keeping learners motivated and thereby improving learning outcomes (Ali et al., 2019 ; Alshammari, 2016 ; Alzain et al., 2018a , b ; Liang, 2012 ; Mahnane et al., 2013 ; Nainie et al., 2010 ; Velázquez & Assar, 2009 ). The term "learning style" refers to the process by which the learner organizes, processes, represents, and combines information, stores it in memory, and then retrieves information and experiences in the style that reflects how they communicate them (Fleming & Baume, 2006 ; Jaleel & Thomas, 2019 ; Jonassen & Grabowski, 2012 ; Klasnja-Milicevic et al., 2011 ; Nuankaew et al., 2019 ; Pashler et al., 2008 ; Willingham et al., 2015 ; Zhang, 2017 ). The concept of learning style is founded on the fact that students vary in how they receive knowledge and think, which helps them recognize and combine information in their minds as well as acquire experiences and skills (Naqeeb, 2011 ). The extensive scholarly literature on learning styles contains few strong experimental findings (Truong, 2016 ), and few findings on the effect of adapting instruction to learning style. There are many models of learning styles (Aldosarim et al., 2018 ; Alzain et al., 2018a , 2018b ; Cletus & Eneluwe, 2020 ; Franzoni & Assar, 2009 ; Willingham et al., 2015 ), including the VARK model, one of the most well-known models used to classify learning styles. The VARK questionnaire offers insight into information-processing preferences (Johnson, 2009 ). Fleming and Baume ( 2006 ) developed the VARK model, which distinguishes four preferred learning types: "V" stands for the visual style, "A" for the aural (auditory) style, "R/W" for the reading/writing style, and "K" for the kinesthetic (practical) style. Moreover, VARK divides the visual category further into graphical and textual, that is, visual and read/write learners (Murphy et al., 2004 ; Leung et al., 2014 ; Willingham et al., 2015 ). The four categories of the VARK learning style inventory are shown in Fig. 1 below.

Figure 1. VARK learning styles.

According to the VARK model, learners are classified into four groups representing basic learning styles based on their responses to 16 questions; each question has four possible responses, and each response corresponds to one of the styles (Hussain, 2017 ; Silva, 2020 ; Zhang, 2017 ), supporting instructors who use the model to create effective courses for students. According to Fleming and Baume ( 2006 ), visual learners prefer to receive instructional materials and submit assignments using tools such as maps, graphs, images, and other symbols. Read/write learners prefer written textual learning materials such as glossaries, handouts, textbooks, and lecture notes. Aural learners, on the other hand, prefer to learn through spoken materials, dialogue, lectures, and discussions. Direct practice and learning by doing are preferred by kinesthetic learners (Becker et al., 2007 ; Fleming & Baume, 2006 ; Willingham et al., 2015 ). As a result, this research aims to provide a comprehensive discussion of how these individual parameters can be applied in adaptive e-learning environment practices. Dominic et al. ( 2015 ) presented a framework for an adaptive educational system that personalized learning content based on student learning styles (the Felder-Silverman learning model) and other factors such as learners' competency level in the subject. This framework allowed students to follow their own adaptive learning content paths based on filling in the ILS questionnaire, providing a customized framework that can automatically respond to students' learning styles and suggest online activities with complete personalization. Similarly, El Bachari et al. ( 2011 ) attempted to determine a student's unique learning style and then adapt instruction to that individual's interests. Adaptive e-learning focused on learner experience and learning style has a higher degree of perceived usability than a non-adaptive e-learning system, according to Alshammari et al. ( 2015 ), and can also improve learners' satisfaction, engagement, and motivation, thus improving their learning.

According to the findings of several studies (Akbulut & Cardak, 2012 ; Alshammari & Qtaish, 2019 ; Alzain et al., 2018a , b ; Shi et al., 2013 ; Truong, 2016 ), adaptation based on a combination of learning style and knowledge level yields significantly better learning gains. Researchers have recently begun to focus on how to personalize e-learning experiences using personal characteristics such as the student's preferred learning style. Personal learning challenges are addressed by adaptive learning programs, which provide learners with courses fitted to their specific needs, such as their learning styles.

Student engagement

Previous research has emphasized that student participation is a key factor in overcoming academic problems such as poor academic performance, isolation, and high dropout rates (Fredricks et al., 2004 ). Student participation is vital to student learning, especially in an online environment where students may feel isolated and disconnected (Dixson, 2015 ). Student engagement is the degree to which students consciously engage with a course's materials, other students, and the instructor; it is important for keeping students involved in the course and, as a result, in their learning (Barkley & Major, 2020 ; Lee et al., 2019 ; Rogers-Stacy et al., 2017 ). Extensive research has investigated the degree of student engagement in web-based and traditional education systems; for instance, Hussain et al. ( 2018 ) used a variety of methods and input features to test the relationship between student data and student participation. Guo et al. ( 2014 ) examined students' participation while they watched videos, with input characteristics based on the time spent watching and how often students responded to the assessment.

Atherton et al. ( 2017 ) found a correlation between the use of course materials and student performance: greater use of course content is expected to lead to better grades. Pardo et al. ( 2016 ) found that students' interaction with interactive learning activities has a significant impact on test scores. According to previous research, course results are positively correlated with student participation. For example, Atherton et al. ( 2017 ) explained that students who accessed learning materials online and took exams regularly obtained higher test scores. Other studies have shown that students with higher levels of participation in questionnaires and course activities tend to perform well (Mutahi et al., 2017 ).

Skills, emotion, participation, and performance, according to Dixson ( 2015 ), are the factors of online learning engagement. Skills engagement includes behaviors such as practicing on a daily basis, paying attention while listening and reading, and taking notes. Emotion refers to how learners feel about learning, such as how much they want to learn. Participation refers to how learners act in class, such as chatting, discussing, or conversing. Performance is an outcome, such as a good grade or a good test score. In general, engagement indicates that students spend time and energy on learning materials and skills, interact constructively with others in the classroom, and participate in emotional learning in one way or another (that is, are motivated by an idea and willing to learn and interact). Student engagement is produced through personal attitudes, thoughts, behaviors, and communication with others, involving a certain level of thought, effort, and feeling while studying. Therefore, the student engagement scale attempts to measure what students are doing (thinking actively), how they relate to their learning, and how they relate to content, faculty members, and other learners, including the factors shown in Fig. 2 (skills, participation/interaction, performance, and emotions). Previous research has moved beyond comparing online and face-to-face classes to investigating ways to improve online learning (Dixson, 2015 ; Gaytan & McEwen, 2007 ; Lévy & Wakabayashi, 2008 ; Mutahi et al., 2017 ). Learning effort, involvement in activities, interaction, and learning satisfaction, according to reviews of previous research on student engagement, are significant measures of student engagement in learning environments (Dixson, 2015 ; Evans et al., 2017 ; Lee et al., 2019 ; Mutahi et al., 2017 ; Rogers-Stacy et al., 2017 ). These results point to several features of e-learning environments that can be used as measures of student participation. Successful and engaged online learners learn actively, have the psychological motivation to learn, make good use of prior experience, and make successful use of online technology. Furthermore, they have excellent communication abilities and are adept at both cooperative and self-directed learning (Dixson, 2015 ; Hong, 2009 ; Nkomo et al., 2021 ).

Figure 2. Engagement factors.

Overview of designing the adaptive e-learning environment

To answer the first research question, the paper follows the ADDIE instructional design model: analysis, design, development, implementation, and evaluation. The adaptive learning environment offers an interactive, decentralized media environment that takes into account individual differences among students. Moreover, the environment can spread a culture of self-learning, attract students, and increase their engagement in learning.

Any learning environment intended to accomplish a specific goal should be designed consistently to increase students' motivation to learn, so that students have content personalized to their specific requirements rather than one-size-fits-all content. As a result, a set of instructional design standards for an adaptive e-learning framework based on learning styles was developed according to the following diagram (Fig. 3 ).

Figure 3. The instructional design (ID) model of the adaptive e-learning environment.

As shown in the previous figure, the analysis phase included identifying the course materials and learning tools (syllabus and course plan modules) used for the study. The learning objectives targeted the higher-level objectives (C4–C6: analysis, synthesis, evaluation).

The design phase included writing SMART objectives; the learning materials were written within the module plans. To support adaptive learning, four content paths were identified, along with the learning models, processes, and evaluation. Course structure and navigation were planned. The adaptive structural design identified the relationships between the different components, such as introduction units, learning materials, and quizzes, and determined the materials for the four paths. The course instructional materials were identified as shown in Figure 4 .

Figure 4. Adaptive e-course design.

The development phase included preparing and selecting the media for the e-course according to each content path in the adaptive e-learning environment. During this process, the author completed the storyboard and the media to be included on each page of the storyboard. A category of instructional media was developed for each path (Fig. 5 ).

Figure 5. Roles and deployment diagram of the adaptive e-learning environment.

The author developed a learning styles questionnaire delivered via a mobile app ( https://play.google.com/store/apps/details?id=com.pointability.vark ). The students then accessed the adaptive e-course modules based on their learning styles.

The implementation phase involved professional validation of the course instructional materials. Expert validation was used to evaluate the consistency of the course materials (syllabi and modules), covering student learning activities, learning implementation capability, and student reactions to the modules. The learner's behaviors, errors, navigation, and learning process were continuously monitored to improve the modules based on the data gathered about each learner.

In the evaluation phase, five e-learning specialists reviewed the adaptive e-learning environment, and the framework was then revised based on their recommendations and feedback. The evaluation included content assessment and media evaluation in three forms: instructional design, interface design, and usage design. Learners also checked the proposed framework in a pilot test, in which the environment was tested by ten learners who represented the sample in the first phase. Each learner's behavior was observed, questions were answered, and learning control, media access, and time spent learning were all verified.

Research methodology

Research purpose and questions

This research aims to investigate the impact of designing an adaptive e-learning environment on the development of students' engagement. The conceptual framework of the research is illustrated in Fig. 6 . The main research question is: "What is the impact of an adaptive e-learning environment based on (VARK) learning styles on developing students' engagement?" Accordingly, there are two sub-questions: (a) "What is the instructional design of the adaptive e-learning environment?" and (b) "What is the impact of adaptive e-learning based on (VARK) learning styles on developing students' engagement (skills, participation, performance, emotional) in comparison with conventional e-learning?"

Figure 6. The conceptual framework (model) of the research questions.

Research hypotheses

The research aims to verify the validity of the following hypotheses:

There is no statistically significant difference between the students' mean scores of the experimental group that exposed to the adaptive e-learning environment and the scores of the control group that was exposed to the conventional e-learning environment in pre-application of students' engagement scale.

There is a statistically significant difference at the level of (0.05) between the students' mean scores of the experimental group (adaptive e-learning) and the scores of the control group (conventional e-learning) in post-application of students' engagement factors in favor of the experimental group.

Research design

This research used a quasi-experimental pretest–posttest design. The independent and dependent research variables are shown in Fig. 7 .

Figure 7. Research (experimental) design.

Both groups were informed of the learning activity tracks. The experimental group was instructed to use the adaptive learning environment to accomplish the learning goals; the control group, on the other hand, was exposed to the conventional e-learning environment without the adaptive e-learning parameters.

Research participants

The population of the study consisted of students aged 17–18 years studying the "learning skills" course in the common first-year deanship. All participants were enrolled in the first term of the 2019–2020 academic year and were taught by the same instructors. The research sample included two classes (118 students) selected randomly from the learning skills department. One class was randomly assigned as the control group (N = 58; 31 males and 27 females) and the other as the experimental group (N = 60; 36 males and 24 females). Table 1 shows the demographic distribution of the student sample.

The instructional materials had not been presented to the students before. The control group attended the conventional e-learning class, where they were provided with the learning environment introducing the "learning skills" course without the adaptive e-learning parameter based on learning styles. The experimental group used adaptive e-learning based on learning styles to learn the same course instructional materials within the e-course. Moreover, all student participants were required to read the guidelines and give their permission to indicate their readiness to participate in the research experiment.

Research instruments

In this research, the measuring tools included the VARK questionnaire and the students' engagement scale including the following factors (skills, participation/interaction, performance, emotional). To begin, the pre-post scale was designed to assess the level of student engagement related to the "learning skills" course before and after participating in the experiment.

VARK questionnaire

Questionnaires are a common method for collecting data in education research (McMillan & Schumacher, 2006 ). The VARK questionnaire was organized electronically and distributed to the students through the developed mobile app, and responses were registered on the UQU system. The questionnaire consisted of 16 multiple-choice items classified into four main factors (kinesthetic, auditory, visual, and read/write).
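
As an illustration of how such responses can be scored, the sketch below tallies single-choice answers and returns the dominant style. This is only a hedged sketch: the published VARK instrument also allows multiple selections per item and multimodal profiles, and the function and answer coding here are assumptions, not the app's actual implementation.

```python
# Minimal sketch (assumption): scoring a 16-item VARK questionnaire with one answer per item.
from collections import Counter

STYLES = ("V", "A", "R", "K")  # visual, aural, read/write, kinesthetic

def score_vark(answers: list[str]) -> tuple[Counter, str]:
    """answers: 16 letters, each one of 'V', 'A', 'R', 'K'."""
    if len(answers) != 16 or any(a not in STYLES for a in answers):
        raise ValueError("Expected 16 answers coded as V, A, R, or K")
    counts = Counter(answers)
    # Dominant style = most frequently chosen option (ties resolved by fixed order here).
    dominant = max(STYLES, key=lambda s: counts[s])
    return counts, dominant

counts, dominant = score_vark(list("VVARKKVRAVKVRRKA"))  # illustrative answer string
print(counts, "dominant style:", dominant)
```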

Reliability and validity of the VARK questionnaire

For the reliability analysis, Cronbach’s alpha was used to evaluate internal consistency. Internal consistency was assessed by calculating the correlation of each item with the factor to which it belongs and the correlations among the factors. Values of 0.70 and above are normally recognized as indicating high reliability (Hinton et al., 2014 ). The Cronbach's alpha coefficient for the VARK questionnaire was 0.83, indicating that the questionnaire was reliable and suitable for further research.
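
For readers unfamiliar with the statistic, the sketch below shows the standard Cronbach's alpha computation on a respondents-by-items matrix. The simulated data and function name are illustrative assumptions, not the study data.

```python
# Minimal sketch (assumption): Cronbach's alpha
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, rows = respondents, columns = scale items."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated correlated responses (a shared latent trait plus noise), illustrative only.
rng = np.random.default_rng(0)
trait = rng.normal(size=(100, 1))
data = trait + rng.normal(scale=1.0, size=(100, 16))
print(round(cronbach_alpha(data), 2))
```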

Students' engagement scale

The engagement scale was developed after a review of the literature on student engagement. The Dixson scale was used to measure student engagement; it comprises four major factors (skills, participation/interaction, performance, emotional). The author adapted the original Dixson scale as follows: the original 48 statements were translated and adapted into Arabic by the author, and after consulting with experts, the instrument was reduced to 27 items suited to the university learning environment. Items are rated on a 5-point scale.

The final version of the engagement scale comprised four factors. The skills engagement factor included ten items addressing keeping up with and reading instructional materials and exerting effort. The participation/interaction engagement factor included five items measuring having fun and regularly engaging in group discussion. The performance engagement factor included five items measuring test performance and receiving a successful score. The emotional engagement factor included seven items assessing whether the course was interesting. Students could access and respond to the engagement scale at the following link: http://bit.ly/2PXGvvD . The objective of the scale is to measure common first-year students' possession of the basic engagement factors before and after instruction with adaptive e-learning compared with conventional e-learning.

Reliability and validity of the engagement scale

The alpha coefficients of the scale factors are presented. All four subscales had a strong degree of internal consistency (0.80–0.87), indicating strong reliability. The overall reliability of the instruments used in this study, calculated with Cronbach's alpha, was 0.81, meaning that the instruments were reliable. The instruments used in this research therefore demonstrated strong validity and reliability, allowing for an accurate assessment of students' engagement in learning. The scale was applied to a pilot sample of 20 students not included in the experimental sample. The instrument had correlation coefficients of 0.74–0.82, indicating a degree of validity that supports its use. Table 2 shows the correlation coefficients and Cronbach's alpha for the engagement scale.

On the other hand, to verify content validity, the scale was presented to specialists to obtain their views on the clarity of the linguistic formulation and its suitability for measuring students' engagement, and to suggest any modifications they deemed appropriate.

Research procedures

To calculate the homogeneity and group equivalence between both groups, the validity of the first hypothesis was examined which stated "There is no statistically significant difference between the students' mean scores of the experimental group that exposed to the adaptive e-learning environment and the scores of the control group that was exposed to the conventional e-learning environment in pre-application of students' engagement scale", the author applied the engagement scale to both groups beforehand, and the scores of the pre-application were examined to verify the equivalence of the two groups (experimental and control) in terms of students' engagement.

The t-test of independent samples was calculated for the engagement scale to confirm the homogeneity of the two classes before the experiment. The t-values were not significant at the level of significance = 0.05, meaning that the two groups were homogeneous in terms of students' engagement scale before the experiment.

Since there was no significant difference in the mean scores of the two groups ( p  > 0.05), the findings presented in Table 3 showed no significant difference between the experimental and control groups either in engagement as a whole or in each engagement factor separately. The findings showed that the two classes were similar before the start of the research experiment.
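
The equivalence check described above can be reproduced, in outline, with an independent-samples t-test per engagement factor. The following is a minimal sketch with placeholder scores: the group sizes match those reported, but the values are simulated, not the study data.

```python
# Minimal sketch (assumption): pre-test equivalence check with an independent-samples t-test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre_experimental = rng.normal(loc=3.4, scale=0.5, size=60)  # hypothetical pre-test scores
pre_control = rng.normal(loc=3.4, scale=0.5, size=58)

t_stat, p_value = stats.ttest_ind(pre_experimental, pre_control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# p > 0.05 would indicate no significant pre-test difference, i.e. homogeneous groups.
```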

Learner content path in adaptive e-learning environment

The previously described, well-designed processes are the foundation for adaptation in e-learning environments. Entries for accommodating materials were identified, including classification by learning style: kinesthetic, auditory, visual, and read/write. The present study covered the first semester of the 2019/2020 academic year. The course was divided into modules concentrating on various topics; eleven of the modules included the adaptive learning exercise. The exercises and quizzes were assigned to specific textbook modules. To reduce irrelevant variation, all versions of the course covered the same content, had the same learning outcomes, and were taught by the same instructor.

In the experimental group, students were asked to bring smartphones; the adaptive learning application was downloaded, a special account was created for each student, and students then accessed the channel designed through the application. The students were provided with instructions and training on how to enter the application and reach the appropriate default element of the developed learning objects, while the control group used the same variety of instructional materials in the same course.

In this adaptive e-course, students in the experimental group were presented with the questionnaire and asked to answer its questions via the developed mobile app, with four choices provided for each question. Students were allowed to answer the questions at their own pace. The correct answer was shown in the students' response results, but the learning module remained marked as incomplete. If a student chose to respond to a question, the correct answer was shown immediately, regardless of the student's response.

Figure 8 illustrates a visual example of learning style identification through responses to the VARK questionnaire. The learning process experienced by the students in this adaptive learning environment is shown in Fig. 4. Students opened the adaptive course link by tapping the app at " https://play.google.com/store/apps/details?id=com.pointability.vark ", which displayed their appropriate position in the learning skills course and their current status, and directed them to the learning skills they were most interested in learning. Once students reached a specific point in the e-learning environment, they could access the relevant digital instructional materials. Students were then able to progress through the various styles offered by the proposed method, giving them greater flexibility in their learning pace.

Figure 8: Visual example of learning style identification and the adaptive e-learning course process

The "flowchart" diagram below illustrates the learner's path in an adaptive e-learning environment, depending on the (VARK) learning styles (visual, auditory, kinesthetic, reading/writing) (Fig. 9 ).

Figure 9: Student learning path

Following the design model of the adaptive framework described above, students first responded to the learning styles questionnaire. Based on each student's results, the system directed the student to the "Visual", "Aural", "Read/Write", or "Kinesthetic" path. At the beginning of the course, students also completed the engagement scale online at their own pace.

Based on these results, the system produced an individualized learning plan to fill the gap identified by the initial VARK questionnaire results. The learner model represents important learner characteristics such as personal information, knowledge level, and learning preferences. Pre- and post-measurements were performed for both the experimental and control groups; only the experimental group was exposed to the treatment (the adaptive learning environment).
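
To make the routing step in Figs. 4 and 9 concrete, the sketch below shows one way the decision logic could be expressed: the dominant VARK category from the questionnaire determines which set of learning objects a student is shown first. The style names follow the paper, but the content lists, function names, and example scores are hypothetical; this is not the study's actual app implementation.

```python
# Hypothetical mapping from dominant VARK style to the learning objects a
# student is routed to first; the object names are illustrative only.
CONTENT_BY_STYLE = {
    "visual": ["infographics", "concept maps", "annotated diagrams"],
    "aural": ["recorded lectures", "podcasts", "discussion prompts"],
    "read_write": ["reading packs", "structured notes", "written exercises"],
    "kinesthetic": ["simulations", "hands-on tasks", "interactive quizzes"],
}

def dominant_style(vark_scores: dict) -> str:
    """Return the VARK category with the highest questionnaire score."""
    return max(vark_scores, key=vark_scores.get)

def build_learning_path(vark_scores: dict) -> list:
    """Place the dominant style's materials first, then the remaining styles."""
    first = dominant_style(vark_scores)
    rest = [s for s in CONTENT_BY_STYLE if s != first]
    return CONTENT_BY_STYLE[first] + [obj for s in rest for obj in CONTENT_BY_STYLE[s]]

if __name__ == "__main__":
    # Example questionnaire result for one hypothetical student.
    scores = {"visual": 9, "aural": 4, "read_write": 6, "kinesthetic": 5}
    print(dominant_style(scores))           # -> visual
    print(build_learning_path(scores)[:3])  # visual materials come first
```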

The second question asked: "What is the effect of adaptive e-learning based on (VARK) learning styles on the development of students' engagement (skills, participation/interaction, performance, emotional) in comparison with conventional e-learning?"

To address it, the validity of the second research hypothesis was tested, which states: "There is a statistically significant difference at the 0.05 level between the mean scores of the experimental group (adaptive e-learning) and the scores of the control group (conventional e-learning) in the post-application of the students' engagement factors, in favor of the experimental group." To test the hypothesis, the arithmetic means, standard deviations, and t-test values were calculated for the results of the two research groups on the engagement scale factors.

Table 4 indicates that students in the experimental group had significantly higher mean post-test engagement scores (across the engagement factor items) than students in the control group ( p  < 0.05).

The experiment was conducted to evaluate the impact of the proposed adaptive e-learning environment. Independent-samples t-tests were used to compare the two groups' prior behavioral engagement related to the topic of this research. Subsequently, the findings showed that students in the experimental group had higher learning achievement than those taught using the conventional e-learning approach.

To estimate the effect size of the independent variable on the dependent variable, Cohen's d was used to investigate whether adaptive learning can significantly improve students' engagement. According to Cohen (1992), an effect size (ES) of 0.20 is small, 0.50 is medium, and 0.80 is large. For the post-test of the student engagement scale, the effect size between the scores of the experimental and control groups was calculated (d and r) from the means and standard deviations: Cohen's d = 0.826 and effect-size r = 0.401. An ES of this magnitude means that the mean of the treated group lies at approximately the 79th percentile of the control group, a large effect. Effect sizes can also be described as the average percentile rank of the average treated learner relative to the average untreated learner: an ES of 0.0 places the treated group's mean at the 50th percentile of the untreated group, whereas an ES of 0.8 places it at about the 79th percentile. Given that effect size is an important indicator of the strength of the findings, the results show that the dependent variable was strongly influenced across the four engagement factors: skills, performance, participation/interaction, and emotional.
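
The effect-size arithmetic above can be verified with a few lines of code. The sketch below computes Cohen's d from group means and a pooled standard deviation, converts it to the effect-size r, and translates d into the percentile interpretation via the standard normal CDF. The summary statistics used are placeholders, since the exact means and standard deviations from Table 4 are not restated in this section; for d ≈ 0.83 the treated-group mean falls near the 80th percentile of the control distribution, matching the interpretation given above.

```python
import math

def cohens_d(mean_exp, mean_ctrl, sd_exp, sd_ctrl, n_exp, n_ctrl):
    """Cohen's d using a pooled standard deviation."""
    pooled_var = ((n_exp - 1) * sd_exp ** 2 + (n_ctrl - 1) * sd_ctrl ** 2) / (n_exp + n_ctrl - 2)
    return (mean_exp - mean_ctrl) / math.sqrt(pooled_var)

def d_to_r(d, n_exp, n_ctrl):
    """Convert d to the effect-size correlation r."""
    a = (n_exp + n_ctrl) ** 2 / (n_exp * n_ctrl)
    return d / math.sqrt(d ** 2 + a)

def percentile_of_treated_mean(d):
    """Percentile of the control distribution at which the treated mean sits."""
    return 100 * 0.5 * (1 + math.erf(d / math.sqrt(2)))  # standard normal CDF

if __name__ == "__main__":
    # Placeholder summary statistics, not the study's actual Table 4 values.
    d = cohens_d(mean_exp=82.0, mean_ctrl=74.0, sd_exp=9.5, sd_ctrl=9.9, n_exp=27, n_ctrl=27)
    print(f"d = {d:.3f}, r = {d_to_r(d, 27, 27):.3f}, "
          f"treated mean at ~{percentile_of_treated_mean(d):.0f}th percentile of controls")
```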

Discussion and limitations

This section discusses the impact of the adaptive e-learning environment on the development of student engagement. This paper aimed to design an adaptive e-learning environment based on learning style parameters. The findings revealed four factors correlated with student engagement in e-learning: skills, participation/interaction, performance, and emotional engagement. These engagement factors are significant because they affect learning outcomes (Nkomo et al., 2021 ). The items of each factor correspond to activities related to cognitive processes. The participation/interaction factor, for example, referred to interactions with the content, peers, and instructors; as a result, student engagement in e-learning can be predicted by these interactions. The results are in line with previous research, which found that customized learning materials are important for increasing students' engagement. Adaptive e-learning based on learning styles places a strong emphasis on behavioral engagement, in which students manage their learning while actively participating in online classes, with instruction adapted to each learning style; this leads to improved learning outcomes (Al-Chalabi & Hussein, 2020 ; Chun-Hui et al., 2017 ; Hussein & Al-Chalabi, 2020 ; Pashler et al., 2008 ).

The experimental findings of this research showed that students who learned through adaptive e-learning based on learning styles learned more, as learning styles served in this research as the reference for adapting the e-content path. Students in the experimental group reported that the adaptive e-learning environment was very interesting and able to attract their attention. They also indicated that it was particularly useful because it provided opportunities for them to recall the learning content, thus enhancing their overall learning impression. This may explain why students in the experimental group performed well in class and showed more enthusiasm than students in the control group. This research compared an adaptive e-learning environment to a conventional e-learning approach with respect to engagement in a learning skills course, through instructional content delivery and assessment. The experimental group also showed higher participation than the control group, indicating that the BB activities were better adapted to the students' learning styles.

Previous studies have agreed on the effectiveness of adaptive learning: it provides students with quality learning opportunities adapted to their learning styles and preferences (Alshammari, 2016 ; Hussein & Al-Chalabi, 2020 ; Roy & Roy, 2011 ; Surjono, 2014 ). However, it should be noted that this study is restricted to one aspect of content adaptation, namely adapting learning materials based on learning styles; other approaches include content-dependent adaptation. These findings are consistent with other studies, such as Alshammari and Qtaish ( 2019 ) and Chun-Hui et al. ( 2017 ), which have demonstrated the effectiveness of adaptive e-learning environments. This research differs from others in its focus on Umm Al-Qura University as a case study, the selection of VARK learning styles, the engagement factors, and the closed learning management framework (BB).

The findings of the study revealed that adaptive content has a positive impact on individuals' achievement and engagement when it is matched to their learning styles (kinesthetic, auditory, visual, read/write). Several factors contributed to this. The design of the adaptive e-content for learning skills centered on providing an ideal learning environment for learners and supporting adaptation according to learning style, encouraging them to learn directly, supporting knowledge building, and making the learning process enjoyable. Ali et al. ( 2019 ) confirmed this, indicating that education should be adapted to each individual's learning style, needs, and characteristics. The adaptive e-content design also allows different learners to engage with knowledge by presenting information and skills in a logical sequence based on the adaptive e-learning framework, taking into account its capabilities as well as the diversity of its sources across the web; this is consistent with the findings of Alshammari and Qtaish ( 2019 ).

Accordingly, these results can be attributed to the following: the careful design of the adaptive e-learning environment in light of learning styles and educational preferences according to instructional design (ID) standards; the provision of adaptive content suited to learners' needs, characteristics, and learning styles; the diversity of course content elements (texts, static images, animations, and video); the variety of tests and activities; the diverse methods of reinforcement, feedback, and support from the instructor and peers according to learning style; and the environment's ease of use, its multiple and varied learning sources, and the ability to return to the same point after leaving the environment.

Several studies have shown that using adaptive e-learning technologies allows students to improve their learning knowledge and further enhances their engagement along dimensions such as skills, performance, interaction, and emotional engagement (Ali et al., 2019 ; Graf & Kinshuk, 2007 ; Murray & Pérez, 2015 ); nevertheless, Murray and Pérez ( 2015 ) found that adaptive learning environments have a limited impact on learning outcomes.

The limited empirical findings on the efficacy of adapting teaching to learning styles are mixed. Chun-Hui et al. ( 2017 ) demonstrated that adaptive e-learning technologies can be beneficial to students' learning and development. According to these findings, adaptive e-learning can be considered a valuable method for learning because it can attract students' attention and promote their participation in educational activities (Ali et al., 2019 ). However, only a few recent studies have focused on how adaptive e-learning based on learning styles fits into diverse cultural programs (Benhamdi et al., 2017 ; Pashler et al., 2008 ).

The experimental results revealed that the proposed environment significantly increased students' learning achievements as compared to the conventional e-learning classroom (without adaptive technology). This means that the proposed environment's adaptation could increase students' engagement in the learning process. There is also evidence that an adaptive environment positively impacts other aspects of quality such as student engagement (Murray & Pérez, 2015 ).

Conclusions and implications

Although this field of research has attracted considerable interest in recent years, some questions remain unanswered. This study identifies and addresses several research gaps by developing an adaptive e-learning environment that was shown to increase student engagement. The study aimed to design an adaptive e-learning environment for performing interactive learning activities in a learning skills course. Its main findings revealed a significant difference in learning outcomes and positive results for adaptive e-learning students, indicating that it may be a helpful learning method for higher education, and it contributes to the current adaptive e-learning literature. The findings revealed that adaptive e-learning based on learning styles can help students stay engaged; consequently, it increased student engagement significantly. According to research, each student's learning style is unique, and students prefer different types of instructional materials and activities. Furthermore, students' preferences affect the effectiveness of learning. As a result, the most effective learning environment should adjust its output to the needs of the students. Developing high-quality instructional materials and activities adapted to students' learning styles will help them participate and be more motivated. In conclusion, learning styles are a good starting point for creating instructional materials based on learning theories.

This study's results have important educational implications for future studies on the effect of adaptive e-learning on student interaction. First, the findings may provide data to support the development and improvement of adaptive environments used in blended learning. Second, the results emphasize the need for more quasi-experimental and descriptive research to better understand the benefits and challenges of incorporating adaptive e-learning in higher education institutions. Third, the results of this study indicate that using an adaptive model in an adaptive e-learning environment will encourage, motivate, engage, and activate students' active learning, as well as facilitate their knowledge construction, rather than simply taking in information passively. Fourth, new research is needed to design effective environments in which adaptive learning can be used in higher education institutions to increase academic performance and motivation in the learning process. Finally, the study shows that adaptive e-learning allows students to learn individually, which improves their learning and knowledge of course content, such as increasing their knowledge of learning skills course topics beyond what they can learn in a conventional e-learning classroom.

Contribution to research

The study is intended to provide empirical evidence of the effect of adaptive e-learning on student engagement factors. The research also has practical implications for higher education stakeholders, as it aims to provide university faculty members with learning approaches that will improve student engagement. It is also expected to offer faculty a framework for designing personalized learning environments based on learning styles in various learning situations and for designing more adaptive e-learning environments.

Research implication

Students with their preferred learning styles are more likely to enjoy learning if they are provided with a variety of instructional materials such as references, interactive media, videos, podcasts, storytelling, simulation, animation, problem-solving, games, and accessible educational tools in an e-learning environment. Also, different learning strategies can be accommodated. Other researchers would be able to conduct future studies on the use of the "adaptive e-learning" approach throughout the instructional process, at different phases of learning, and in various e-courses as a result of the current study. Meanwhile, the proposed environment's positive impact on student engagement gained considerable interest for future educational applications. Further research on learning styles in different university colleges could contribute to a foundation for designing adaptive e-courses based on students' learning styles and directing more future research on learning styles.

Implications for practice or policy:

Adaptive e-learning focused on learning styles would help students become more engaged.

Proving the efficacy of an adaptive e-learning environment via comparison with conventional e-learning.

Availability of data and materials

The author confirms that the data supporting the findings of this study are based on the research tools which were prepared and explained by the author and available on the links stated in the research instruments sub-section. The data analysis that supports the findings of this study is available on request from the corresponding author.

Akbulut, Y., & Cardak, C. (2012). Adaptive educational hypermedia accommodating learning styles: A content analysis of publications from 2000 to 2011. Computers & Education . https://doi.org/10.1016/j.compedu.2011.10.008 .


Al-Chalabi, H., & Hussein, A. (2020). Analysis & implementation of personalization parameters in the development of computer-based adaptive learning environment. SAR Journal Science and Research., 3 (1), 3–9. https://doi.org/10.18421//SAR31-01 .

Aldosari, M., Aljabaa, A., Al-Sehaibany, F., & Albarakati, S. (2018). Learning style preferences of dental students at a single institution in Riyadh Saudi Arabia, evaluated using the VARK questionnaire . Advances in Medical Education and Practice. https://doi.org/10.2147/AMEP.S157686 .

Ali, N., Eassa, F., & Hamed, E. (2019). Personalized Learning Style for Adaptive E-Learning System, International Journal of Advanced Trends in Computer Science and Engineering . 223-230. Retrieved June 26, 2020 from http://www.warse.org/IJATCSE/static/pdf/file/ijatcse4181.12019.pdf .

Alshammari, M., & Qtaish, A. (2019). Effective adaptive e-learning systems according to learning style and knowledge level. JITE Research, 18 , 529–547. https://doi.org/10.28945/4459 .

Alshammari, M. (2016). Adaptation based on learning style and knowledge level in e-learning systems, Ph.D. thesis , University of Birmingham.  Retrieved April 18, 2019 from http://etheses.bham.ac.uk//id/eprint/6702/ .

Alshammari, M., Anane, R., & Hendley, R. (2015). Design and Usability Evaluation of Adaptive E-learning Systems based on Learner Knowledge and Learning Style. Human-Computer Interaction Conference- INTERACT , Vol. (9297), (pp. 157–186). https://doi.org/10.1007/978-3-319-22668-2_45 .

Alzain, A., Clack, S., Jwaid, A., & Ireson, G. (2018a). Adaptive education based on learning styles: Are learning style instruments precise enough. International Journal of Emerging Technologies in Learning (iJET), 13 (9), 41–52. https://doi.org/10.3991/ijet.v13i09.8554 .

Alzain, A., Clark, S., Ireson, G., & Jwaid, A. (2018b). Learning personalization based on learning style instruments. Advances in Science Technology and Engineering Systems Journal . https://doi.org/10.25046/aj030315 .

Atherton, M., Shah, M., Vazquez, J., Griffiths, Z., Jackson, B., & Burgess, C. (2017). Using learning analytics to assess student engagement and academic outcomes in open access enabling programs”. Journal of Open, Distance and e-Learning, 32 (2), 119–136.

Barkley, E., & Major, C. (2020). Student engagement techniques: A handbook for college faculty . Jossey-Bass . 10:047028191X.


Becker, K., Kehoe, J., & Tennent, B. (2007). Impact of personalized learning styles on online delivery and assessment. Campus-Wide Information Systems . https://doi.org/10.1108/10650740710742718 .

Behaz, A., & Djoudi, M. (2012). Adaptation of learning resources based on the MBTI theory of psychological types. IJCSI International Journal of Computer Science, 9 (2), 135–141.

Beldagli, B., & Adiguzel, T. (2010). Illustrating an ideal adaptive e-learning: A conceptual framework. Procedia - Social and Behavioral Sciences, 2 , 5755–5761. https://doi.org/10.1016/j.sbspro.2010.03.939 .

Benhamdi, S., Babouri, A., & Chiky, R. (2017). Personalized recommender system for e-Learning environment. Education and Information Technologies, 22 , 1455–1477. https://doi.org/10.1007/s10639-016-9504-y .

Chen, P., Lambert, A., & Guidry, K. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54 , 1222–1232.

Chun-Hui, Wu., Chen, Y.-S., & Chen, T. C. (2017). An adaptive e-learning system for enhancing learning performance: based on dynamic scaffolding theory. Eurasia Journal of Mathematics, Science and Technology Education. https://doi.org/10.12973/ejmste/81061 .

Cletus, D., & Eneluwe, D. (2020). The impact of learning style on student performance: mediate by personality. International Journal of Education, Learning and Training. https://doi.org/10.24924/ijelt/2019.11/v4.iss2/22.47Desmond .

Cohen, J. (1992). Statistical power analysis. Current Directions in Psychological Science., 1 (3), 98–101. https://doi.org/10.1111/1467-8721.ep10768783 .

Daines, J., Troka, T. and Santiago, J. (2016). Improving performance in trigonometry and pre-calculus by incorporating adaptive learning technology into blended models on campus. https://doi.org/10.18260/p.25624 .

DeCapua, A. & Marshall, H. (2015). Implementing a Mutually Adaptive Learning Paradigm in a Community-Based Adult ESL Literacy Class. In M. Santos & A. Whiteside (Eds.). Low Educated Second Language and Literacy Acquisition. Proceedings of the Ninth Symposium (pps. 151-171). Retrieved Nov. 14, 2020 from https://www.researchgate.net/publication/301355138_Implementing_a_Mutually_Adaptive_Learning_Paradigm_in_a_Community-Based_Adult_ESL_Literacy_Class .

Dixson, M. (2015). Measuring student engagement in the online course: The online student engagement scale (OSE). Online Learning . https://doi.org/10.24059/olj.v19i4.561 .

Dominic, M., Xavier, B., & Francis, S. (2015). A Framework to Formulate Adaptivity for Adaptive e-Learning System Using User Response Theory. International Journal of Modern Education and Computer Science, 7 , 23. https://doi.org/10.5815/ijmecs.2015.01.04 .

El Bachari, E., Abdelwahed, E., & M., El. . (2011). E-Learning personalization based on Dynamic learners’ preference. International Journal of Computer Science and Information Technology., 3 , 200–216. https://doi.org/10.5121/ijcsit.2011.3314 .

El-Sabagh, H. A., & Hamed, E. (2020). The Relationship between Learning-Styles and Learning Motivation of Students at Umm Al-Qura University. Egyptian Association for Educational Computer Journal . https://doi.org/10.21608/EAEC.2020.25868.1015 ISSN-Online: 2682-2601.

Ennouamani, S., & Mahani, Z. (2017). An overview of adaptive e-learning systems. Eighth International ConfeRence on Intelligent Computing and Information Systems (ICICIS) . https://doi.org/10.1109/INTELCIS.2017.8260060 .

Evans, S., Steele, J., Robertson, S., & Dyer, D. (2017). Personalizing post titles in the online classroom: A best practice? Journal of Educators Online, 14 (2), 46–54.

Fleming, N., & Baume, D. (2006). Learning styles again: VARKing up the Right Tree! Educational Developments, 7 , 4–7.

Franzoni, A., & Assar, S. (2009). Student learning style adaptation method based on teaching strategies and electronic media. Journal of Educational Technology & Society , 12(4), 15–29. Retrieved March 21, 2020, from http://www.jstor.org/stable/jeductechsoci.12.4.15 .

Fredricks, J., Blumenfeld, P., & Paris, A. (2004). School Engagement: Potential of the Concept . State of the Evidence: Review of Educational Research. https://doi.org/10.3102/00346543074001059 .


Gaytan, J., & McEwen, M. (2007). Effective Online Instructional and Assessment Strategies. American Journal of Distance Education, 21 (3), 117–132. https://doi.org/10.1080/08923640701341653 .

Graf, S. & Kinshuk. K. (2007). Providing Adaptive Courses in Learning Management Systems with respect to Learning Styles. Proceeding of the World Conference on eLearning in Corporate. Government. Healthcare. and Higher Education (2576–2583). Association for the Advancement of Computing in Education (AACE). Retrieved January 18, 2020 from  https://www.learntechlib.org/primary/p/26739/ . ISBN 978-1-880094-63-1.

Guo, P., Kim, V., & Rubin, R. (2014). How video production affects student engagement: an empirical study of MOOC videos. Proceedings of First ACM Conference on Learning @ Scale Confernce . March 2014, (pp. 41-50). https://doi.org/10.1145/2556325.2566239 .

Hinton, P. R., Brownlow, C., McMurray, I., & Cozens, B. (2014). SPSS Explained (2nd ed., pp. 339–354). Routledge Taylor & Francis Group.

Hong, S. (2009). Developing competency model of learners in distance universities. Journal of Educational Technology., 25 , 157–186.

Hussain, I. (2017). Pedagogical implications of VARK model of learning. Journal of Literature, Languages and Linguistics, 38 , 33–37.

Hussain, M., Zhu, W., Zhang, W., & Abidi, S. (2018). Student engagement predictions in an e-learning system and their impact on student course assessment scores. Computational Intelligence, and Neuroscience. https://doi.org/10.1155/2018/6347186 .

Hussein, A., & Al-Chalabi, H. (2020). Pedagogical Agents in an Adaptive E-learning System. SAR Journal of Science and Research., 3 , 24–30. https://doi.org/10.18421/SAR31-04 .

Jaleel, S., & Thomas, A. (2019). Learning styles theories and implications for teaching learning . Horizon Research Publishing. 978-1-943484-25-6.

Johnson, M. (2009). Evaluation of Learning Style for First-Year Medical Students. Int J Schol Teach Learn . https://doi.org/10.20429/ijsotl.2009.030120 .

Jonassen, D. H., & Grabowski, B. L. (2012). Handbook of individual differences, learning, and instruction. Routledge . https://doi.org/10.1016/0022-4405(95)00013-C .

Klasnja-Milicevic, A., Vesin, B., Ivanovic, M., & Budimac, Z. (2011). E-Learning personalization based on hybrid recommendation strategy and learning style identification. Computers & Education, 56 (3), 885–899. https://doi.org/10.1016/j.compedu.2010.11.001 .

Kolekar, S. V., Pai, R. M., & Manohara Pai, M. M. (2017). Prediction of learner’s profile based on learning styles in adaptive e-learning system. International Journal of Emerging Technologies in Learning, 12 (6), 31–51. https://doi.org/10.3991/ijet.v12i06.6579 .

Lee, J., & Kim, D. (2012). Adaptive learning system applied bruner’ EIS theory. International Conference on Future Computer Supported Education, IERI Procedia, 2 , 794–801. https://doi.org/10.1016/j.ieri.2012.06.173 .

Lee, J., Song, H.-D., & Hong, A. (2019). Exploring factors, and indicators for measuring students’ sustainable engagement in e-learning. Sustainability, 11 , 985. https://doi.org/10.3390/su11040985 .

Leung, A., McGregor, M., Sabiston, D., & Vriliotis, S. (2014). VARK learning styles and student performance in principles of Micro-vs. Macro-Economics. Journal of Economics and Economic Education Research, 15 (3), 113.

Lévy, P. & Wakabayashi, N. (2008). User's appreciation of engagement in service design: The case of food service design. Proceedings of International Service Innovation Design Conference 2008 - ISIDC08 . Busan, Korea. Retrieved October 28, 2019 from https://www.researchgate.net/publication/230584075 .

Liang, J. S. (2012). The effects of learning styles and perceptions on application of interactive learning guides for web-based. Proceedings of Australasian Association for Engineering Education Conference AAEE . Melbourne, Australia. Retrieved October 22, 2019 from https://aaee.net.au/wpcontent/uploads/2018/10/AAEE2012-Liang.-Learning_styles_and_perceptions_effects_on_interactive_learning_guide_application.pdf .

Mahnane, L., Laskri, M. T., & Trigano, P. (2013). A model of adaptive e-learning hypermedia system based on thinking and learning styles. International Journal of Multimedia and Ubiquitous Engineering, 8 (3), 339–350.

Markey, M. K. & Schmit, K, J. (2008). Relationship between learning style Preference and instructional technology usage. Proceedings of American Society for Engineering Education Annual Conference & Expodition . Pittsburgh, Pennsylvania. Retrieved March 15, 2020 from https://peer.asee.org/3173 .

McMillan, J., & Schumacher, S. (2006). Research in education: Evidence-based inquiry . Pearson.

Murphy, R., Gray, S., Straja, S., & Bogert, M. (2004). Student learning preferences and teaching implications: Educational methodologies. Journal of Dental Education, 68 (8), 859–866.

Murray, M., & Pérez, J. (2015). Informing and performing: A study comparing adaptive learning to traditional learning. Informing Science. The International Journal of an Emerging Transdiscipline , 18, 111–125. Retrieved Febrauary 4, 2021 from http://www.inform.nu/Articles/Vol18/ISJv18p111-125Murray1572.pdf .

Mutahi, J., Kinai, A. , Bore, N. , Diriye, A. and Weldemariam, K. (2017). Studying engagement and performance with learning technology in an African classroom, Proceedings of Seventh International Learning Analytics & Knowledge Conference , (pp. 148–152), Canada: Vancouver.

Nainie, Z., Siraj, S., Abuzaiad, R. A., & Shagholi, R. (2010). Hypothesized learners’ technology preferences based on learning styles dimensions. The Turkish Online Journal of Educational Technology, 9 (4), 83–93.

Naqeeb, H. (2011). Learning Styles as Perceived by Learners of English as a Foreign Language in the English Language Center of The Arab American University—Jenin. Palestine. an Najah Journal of Research, 25 , 2232.

Nkomo, L. M., Daniel, B. K., & Butson, R. J. (2021). Synthesis of student engagement with digital technologies: a systematic review of the literature. International Journal of Educational Technology in Higher Education . https://doi.org/10.1186/s41239-021-00270-1 .

Normadhi, N. B., Shuib, L., Nasir, H. N. M., Bimba, A., Idris, N., & Balakrishnan, V. (2019). Identification of personal traits in adaptive learning environment: Systematic literature review. Computers & Education, 130 , 168–190. https://doi.org/10.1016/j.compedu.2018.11.005 .

Nuankaew, P., Nuankaew, W., Phanniphong, K., Imwut, S., & Bussaman, S. (2019). Students model in different learning styles of academic achievement at the University of Phayao, Thailand. International Journal of Emerging Technologies in Learning (iJET)., 14 , 133. https://doi.org/10.3991/ijet.v14i12.10352 .

Oxman, S. & Wong, W. (2014). White Paper: Adaptive Learning Systems. DV X Innovations DeVry Education Group. Retrieved December 14, 2020 from shorturl.at/hnsS8 .

Ozyurt, Ö., & Ozyurt, H. (2015). Learning style-based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Computers in Human Behavior, 52 , 349–358. https://doi.org/10.1016/j.chb.2015.06.020 .

Pardo, A., Han, F., & Ellis, R. (2016). Exploring the relation between self-regulation, online activities, and academic performance: a case study. Proceedings of Sixth International Conference on Learning Analytics & Knowledge , (pp. 422-429). https://doi.org/10.1145/2883851.2883883 .

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: concepts and evidence. Psychology Faculty Publications., 9 (3), 105–119. https://doi.org/10.1111/j.1539-6053.2009.01038.x .

Qazdar, A., Cherkaoui, C., Er-Raha, B., & Mammass, D. (2015). AeLF: Mixing adaptive learning system with learning management system. International Journal of Computer Applications., 119 , 1–8. https://doi.org/10.5120/21140-4171 .

Robinson, C., & Hullinger, H. (2008). New benchmarks in higher education: Student engagement in online learning. Journal of Education for Business, 84 , 101–109.

Rogers-Stacy, C., Weister, T., & Lauer, S. (2017). Nonverbal immediacy behaviors and online student engagement: Bringing past instructional research into the present virtual classroom. Communication Education, 66 (1), 37–53.

Roy, S., & Roy, D. (2011). Adaptive e-learning system: a review. International Journal of Computer Trends and Technology (IJCTT), 1 (1), 78–81. ISSN:2231-2803.

Shi, L., Cristea, A., Foss, J., Qudah, D., & Qaffas, A. (2013). A social personalized adaptive e-learning environment: a case study in topolor. IADIS International Journal on WWW/Internet., 11 , 13–34.

Shih, M., Feng, J., & Tsai, C. (2008). Research and trends in the field of e-learning from 2001 to 2005: A content analysis of cognitive studies in selected journals. Computers & Education, 51 (2), 955–967. https://doi.org/10.1016/j.compedu.2007.10.004 .

Silva, A. (2020). Towards a Fuzzy Questionnaire of Felder and Solomon for determining learning styles without dichotomic in the answers. Journal of Learning Styles, 13 (15), 146–166.

Staikopoulos, A., Keeffe, I., Yousuf, B. et al., (2015). Enhancing student engagement through personalized motivations. Proceedings of IEEE 15th International Conference on Advanced Learning Technologies , (pp. 340–344), Taiwan: Hualien. https://doi.org/10.1109/ICALT.2015.116 .

Surjono, H. D. (2014). The evaluation of Moodle-based adaptive e-learning system. International Journal of Information and Education Technology, 4 (1), 89–92. https://doi.org/10.7763/IJIET.2014.V4.375 .

Truong, H. (2016). Integrating learning styles and adaptive e-learning system: current developments, problems, and opportunities. Computers in Human Behavior, 55 (2016), 1185–1193. https://doi.org/10.1016/j.chb.2015.02.014 .

Umm Al-Qura University Agency for Educational Affairs (2020). Common first-year Deanship, at Umm Al-Qura University. Retrieved February 3, 2020 from https://uqu.edu.sa/en/pre-edu/70021 .

Vassileva, D. (2012). Adaptive e-learning content design and delivery based on learning style and knowledge level. Serdica Journal of Computing, 6 , 207–252.

Veiga, F., Robu, V., Appleton, J., Festas, I & Galvao, D. (2014). Students' engagement in school: Analysis according to self-concept and grade level. Proceedings of EDULEARN14 Conference 7th-9th July 2014 (pp. 7476-7484). Barcelona, Spain. Available Online at: http://hdl.handle.net/10451/12044 .

Velázquez, A., & Assar, S. (2009). Student learning styles adaptation method based on teaching strategies and electronic media. Educational Technology & SocieTy., 12 , 15–29.

Verdú, E., Regueras, L., & De Castro, J. (2008). An analysis of the research on adaptive Learning: The next generation of e-learning. WSEAS Transactions on Information Science and Applications, 6 (5), 859–868.

Willingham, D., Hughes, E., & Dobolyi, D. (2015). The scientific status of learning styles theories. Teaching of Psychology., 42 (3), 266–271. https://doi.org/10.1177/0098628315589505 .

Yalcinalp & Avcı. (2019). Creativity and emerging digital educational technologies: A systematic review. The Turkish Online Journal of Educational Technology, 18 (3), 25–45.

Yang, J., Huang, R., & Li, Y. (2013). Optimizing classroom environment to support technology enhanced learning. In A. Holzinger & G. Pasi (Eds.), Human-computer interaction and knowledge discovery in complex (pp. 275–284). Berlin: Springer.

Zhang, H. (2017). Accommodating different learning styles in the teaching of economics: with emphasis on fleming and mills¡¯s sensory-based learning style typology. Applied Economics and Finance, 4 (1), 72–78.


Acknowledgements

The author would like to thank the Deanship of Scientific Research at Umm Al-Qura University for the continuous support. This work was supported financially by the Deanship of Scientific Research at Umm Al-Qura University to Dr.: Hassan Abd El-Aziz El-Sabagh. (Grant Code: 18-EDU-1-01-0001).

Author information

Hassan A. El-Sabagh is an assistant professor in the E-Learning Deanship and head of the Instructional Programs Department, Umm Al-Qura University, Saudi Arabia, where he has worked since 2012. He has extensive experience in the field of e-learning and educational technologies, having served primarily at the Educational Technology Department of the Faculty of Specific Education, Mansoura University, Egypt since 1997. In 2011, he earned a Ph.D. in Educational Technology from Dresden University of Technology, Germany. He has over 14 papers published in international journals/conference proceedings, as well as serving as a peer reviewer in several international journals. His current research interests include eLearning Environments Design, Online Learning; LMS-based Interactive Tools, Augmented Reality, Design Personalized & Adaptive Learning Environments, and Digital Education, Quality & Online Courses Design, and Security issues of eLearning Environments. (E-mail: [email protected]; [email protected]).

Authors and Affiliations

E-Learning Deanship, Umm Al-Qura University, Mecca, Saudi Arabia

Hassan A. El-Sabagh

Faculty of Specific Education, Mansoura University, Mansoura, Egypt


Contributions

The author read and approved the final manuscript.

Corresponding author

Correspondence to Hassan A. El-Sabagh .

Ethics declarations

Competing interests.

The author declares that there is no conflict of interest

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article.

El-Sabagh, H.A. Adaptive e-learning environment based on learning styles and its impact on development students' engagement. Int J Educ Technol High Educ 18 , 53 (2021). https://doi.org/10.1186/s41239-021-00289-4

Download citation

Received : 24 May 2021

Accepted : 19 July 2021

Published : 01 October 2021

DOI : https://doi.org/10.1186/s41239-021-00289-4


  • Adaptive e-Learning
  • Learning style
  • Learning impact


  • Research article
  • Open access
  • Published: 21 October 2019

Perception of the learning environment among the students in a nursing college in Eastern Nepal

  • Erina Shrestha 1 ,
  • Ram Sharan Mehta 1 ,
  • Gayanand Mandal 1 ,
  • Kriti Chaudhary 2 &
  • Nirmala Pradhan 3  

BMC Medical Education, volume 19, Article number: 382 (2019)


The learning environment is an important basis for students' learning processes and for their preferences regarding future workplaces. It is considered an essential factor in determining the success of an effective curriculum and students' academic achievements. This study attempts to assess nursing students' perception of the learning environment.

A descriptive cross-sectional study design was used to conduct the study among 122 nursing students studying at B.P. Koirala Institute of Health Science. Data were collected following a total enumerative sampling method using a self-administered questionnaire. The Dundee Ready Educational Environment Measure (DREEM) inventory was used to assess the perception of the learning environment. Descriptive statistics (frequency, percentage, mean and standard deviation) were used to describe the demographic and other related variables. One-way Analysis of Variance (ANOVA) was used to examine differences in the overall scale score and its subscales across selected socio-demographic variables (age, ethnicity, residence, year of enrollment) of the respondents.

The mean age of the students was 21 ± 1.46 years. The majority of the students were from Province no. 1 (57.4%), largely from Sunsari district (25.4%). First-year students were found to be more satisfied (68.23%) with the educational environment (136.45 ± 16.93) than students of other years. Academic self-perception (21.94 ± 3.42) was the highest-scoring subscale (68.57%), while social self-perception (16.43 ± 2.96) was the lowest (58.66%). The overall DREEM score (131.25 ± 15.82 out of 200) indicated that students' perception of the learning environment was positive. Despite this overall positive perception, students perceived the teachers as authoritative and reported the lack of a good support system for students at times of stress. The total DREEM score varied significantly with the year of enrollment ( p  < 0.05).

The current study showed a positive perception of the learning environment, which varied significantly according to the year of enrollment. However, improvements are required across all five domains to achieve a high-quality educational environment. Future qualitative studies are recommended to confirm this finding and to gain an in-depth understanding of it.

Peer Review reports

The educational environment is one of the most important factors in determining the success of an effective curriculum and, subsequently, students' academic success [ 1 ]. The quality of the educational climate affects the quality of the curriculum, teaching and learning, and the development of student outcomes as practitioners [ 2 ]. Bloom described the educational or learning environment concept as “the conditions, external stimuli and forces which may be physical, social, as well as intellectual forces which challenge on the individual and influence students’ learning outcomes” [ 3 ].

A good or effective learning environment is not limited to only teacher’s good communication skills, knowledge, credibility and preparedness contributing towards teaching excellence. An environment that best prepares the students for their future professional life and contributes towards their personal and psychosomatic development as well as the social well-being is considered as an ideal academic environment [ 1 ].

As cited by Sayed and El-Sayed [ 4 ], Jiffry et al. indicated that the major domains encompassed in an educational environment of any health school are self-perception of learning, self-perception of teachers, academic self-perception, self-perception of atmosphere, and social self-perception.

Roff and McAleer have indicated that environment that is competitive, authoritarian, stressful, or threatening may de-motivate students and weaken their interest and commitment for learning process. Environment that is collaborative, collegial, and supportive may enhance greater engagement of nursing students and this may lead to improved preparedness for clinical training [ 4 ].

A previous study showed that students’ perception of their current learning environment is an even stronger predictor of learning outcomes at university than their prior achievements at school [ 5 ]. Mayya and Roff [ 6 ] found significant differences in students’ perceptions of the learning environment between academic achievers and under-achievers.

It is evident from recent literatures that the educational environment encountered by students has a significant impact on their behavior, satisfaction with the course of study, perceived well- being, aspirations and academic achievement [ 7 , 8 ].

Studying the learning environment is important in improving the quality of an educational program [ 3 ]. The most used and accessible way of examining the educational environment is to evaluate the students’ perception of that environment [ 9 ].

A systematic review conducted by Miles et al. [ 8 ] showed that studies of the perception of the learning environment using the Dundee Ready Educational Environment Measure (DREEM) have been conducted in at least 20 countries around the world. However, these studies have predominantly been conducted among medical undergraduate students. Research in this regard in developing countries such as Nepal is limited, especially among nursing students.

Many methods have been tried to obtain such a reading and these include questionnaire tools, focus group studies, student feedback etc. Among them, the DREEM questionnaire is said to be one of the widely used and more specific tools in relation to assessing educational environment, especially in relation to medical education. DREEM has been validated as a universal diagnostic inventory for assessing the quality of educational environment [ 4 , 5 , 8 ].

Apart from this, the findings from the DREEM have been found to be consistent with qualitative information obtained via interviews [ 10 ]. Secondly, even though the DREEM has been used mainly with medical students, it was constructed by a panel of faculty not only from medical schools but also from other health professions, and its items were based on their perceptions of learning climates conducive to education in the health professions, not just medicine [ 8 ].

The main objective of the study was to assess the perception of learning environment among the nursing students and to find out the difference in the overall score of perception of learning environment and its subscales across the selected variables.

Study design and setting

A descriptive cross-sectional design was employed for this study. The study was carried out among the B.Sc. Nursing students studying at B.P. Koirala Institute of Health Science (BPKIHS), tertiary level medical university in eastern Nepal.

Participants

The required sample for this study was estimated using the formula n = z²σ²/d². Based on the study conducted by Kohli and Dhaliwal [ 11 ], with mean = 101.13, standard deviation σ = 21.14, absolute precision d = 4.04 (4% of the mean), and z at the 5% level = 1.96, and adding 10% for non-response, the final sample size was 115. The total number of students currently enrolled in the program was 128; hence, all students were enrolled in the study following the total enumerative sampling method. Students who were currently enrolled in the B.Sc. Nursing program, were present at the time of data collection, and gave consent were included in the study.
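
As a quick check of the arithmetic, the sample-size formula can be evaluated directly. The snippet below plugs in the values quoted from Kohli and Dhaliwal and adds the 10% non-response allowance; depending on where rounding is applied, the result lands at roughly 115–116, consistent with the reported sample of 115.

```python
import math

z = 1.96        # z-value for a 5% significance level
sigma = 21.14   # SD reported by Kohli and Dhaliwal [11]
d = 4.04        # absolute precision (4% of the mean, 101.13)

n = (z ** 2 * sigma ** 2) / d ** 2   # base sample size
n_final = n * 1.10                   # add 10% for non-response
print(round(n), math.ceil(n_final))  # ~105 and ~116
```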

Data were collected using a self-administered questionnaire based on the objectives of the research which consisted of two sections:

Section A: socio-demographic characteristics of the students (age, ethnicity, residence and year of enrollment).

Section B: items related to perception of learning environment based on Dundee Ready Education Environment Measure (DREEM).

The DREEM is an internationally validated, non-culturally specific inventory. It includes 50 items rated on a five-point Likert scale (0–4). The items are grouped into five subscales: Students’ Perception of Learning (12 items), Students’ Perception of Teachers (11 items), Students’ Academic Self-Perception (8 items), Students’ Perception of Atmosphere (12 items), and Students’ Social Self-Perception (7 items). There are nine negative items (items 4, 8, 9, 17, 25, 35, 39, 48, and 50), for which correction is made by reversing the scores; after correction, higher scores indicate disagreement with the item. Individual items with a mean score of ≥ 3.5 are true positive points; those with a mean of ≤ 2 are problem areas; scores between these two limits indicate aspects of the environment that could be enhanced. The maximum global score for the questionnaire is 200, interpreted as follows: 0–50 = very poor, 51–100 = many problems, 101–150 = more positive than negative, 151–200 = excellent [ 8 , 11 , 12 , 13 , 14 ]. The alpha coefficient of the tool for this study was 0.86, which indicates adequate reliability for measurement.
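
Because the DREEM uses reverse-scored items and fixed interpretation bands, its scoring is easy to express programmatically. The sketch below reverse-scores the nine negative items listed above, totals the responses, and maps the total onto the published interpretation ranges. The example responses are invented, and the item-to-subscale assignments are omitted for brevity.

```python
NEGATIVE_ITEMS = {4, 8, 9, 17, 25, 35, 39, 48, 50}  # reverse-scored DREEM items

def score_dreem(responses: dict) -> tuple:
    """Return the total DREEM score (0-200) and its interpretation band.

    `responses` maps item number (1-50) to the raw 0-4 Likert response.
    """
    total = sum(4 - value if item in NEGATIVE_ITEMS else value
                for item, value in responses.items())
    if total <= 50:
        band = "very poor"
    elif total <= 100:
        band = "many problems"
    elif total <= 150:
        band = "more positive than negative"
    else:
        band = "excellent"
    return total, band

if __name__ == "__main__":
    # Invented responses: every item answered 3, just to exercise the function.
    fake_responses = {item: 3 for item in range(1, 51)}
    print(score_dreem(fake_responses))  # -> (132, 'more positive than negative')
```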

The research instrument was pretested to identify any ambiguities in the questionnaire. The pretest was performed with 10% of the sample size (12 students) meeting the inclusion criteria at a different nursing college; these participants were not included in the main study.

Ethical consideration

Ethical clearance was obtained from the Institutional Review Committee (IRC) BPKIHS and approval letter was provided by the research Committee, BPKIHS. Permission from the concerned authority was obtained to conduct the study. Permission was obtained for the use of DREEM inventory. Written informed consent was obtained from respondents prior to the data collection.

Data collection

The English version of the DREEM questionnaire was used. The questionnaire was distributed to the students of each year separately, at the end of the year, during a leisure class. Before the questionnaire was distributed, the students were briefed about the purpose of the study, the data collection procedure, and the meaning of some terms, such as authoritarian, ridicule, and factual learning, which students had found difficult during pretesting. The researcher was present during data collection, and precautions were taken to ensure that students did not copy answers from their friends' questionnaires. Each participant took around 30 minutes to complete the questionnaire. A total of 122 students were present at the time of data collection; all of them completed the questionnaire, and their responses were analyzed.

Statistical analysis

The data were collected, coded, checked for completeness, entered in Microsoft Excel 2007, and transferred to SPSS (Statistical Package for the Social Sciences) PC version 11.5.0. Descriptive statistics (frequency, percentage, mean and standard deviation) were used to describe the demographic and other related variables. One-way Analysis of Variance (ANOVA) was used to examine differences in the scores for the perception of the learning environment and its subscales across the selected socio-demographic variables (age, ethnicity, residence and year of enrollment).
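
The between-year comparison described here corresponds to a standard one-way ANOVA. Below is a generic SciPy sketch with invented total DREEM scores grouped by year of enrollment; it only illustrates the test the authors describe and does not reproduce their SPSS output.

```python
from scipy import stats

# Placeholder total DREEM scores grouped by year of enrollment (not real data).
year1 = [136, 140, 129, 145, 138, 133]
year2 = [124, 130, 127, 122, 129, 126]
year3 = [131, 135, 128, 132, 130, 134]
year4 = [121, 126, 123, 128, 125, 120]

f_stat, p_value = stats.f_oneway(year1, year2, year3, year4)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 indicates that perception of the learning environment
# differs significantly across the years of enrollment.
```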

Around two-thirds (66.39%) of the respondents belonged to the age group 20–22 years. The age of the respondents ranged from 18 to 25 years, with a mean age of 21 ± 1.46 years. Around one fourth of the respondents were Brahmin (26.2%) and Mongolian (25.4%), respectively. Of the students from first to fourth year, only around one fifth (19.7%) were from the second year. Students from all over Nepal were studying in the B.Sc. nursing program. More than half (57.4%) of the respondents were from Province no. 1, and around one fourth (25.4%) were from Sunsari district. Almost half (47.6%) of the respondents came from 28 other districts altogether. As BPKIHS is located in Sunsari district of Province no. 1, students from nearby areas may be more interested in studying there.

The overall score for perception of learning environment was 131.25 ± 15.82 (65.62% of maximum score). Among the five subscales, Student’s Academic self-perception was the highest scoring subscale (68.57%) with the mean score of 21.94 ± 3.42 while the social self-perception was the lowest (58.66%) with the mean score of 16.43 ± 2.96. Subscales means and standard deviation along with percentage are depicted in Table 3 .

Individual item analysis revealed that ten items had a mean score of three or more, indicating positive aspects; however, none of the items scored more than 3.5. Six items scored below two. The mean scores of the majority of the items were between two and three.

Significant differences were found between the years of enrollment for the overall perception of the learning environment ( p  < 0.05). The score for the subscale students’ perception of teachers also varied significantly between the years of enrollment ( p  < 0.05). There was no significant difference in the perception of the learning environment across the other selected variables. The details of the results are presented in Tables 1 , 2 , 3 and 4 .

The overall DREEM mean score obtained in this study indicated that students’ perception of the learning environment was positive. Although this score highlights the student-centered approach followed in the nursing program, improvements are needed to have a positive impact on students’ achievement, satisfaction and success [ 15 ]. A study conducted by Roff et al. [ 12 ] in a Nepalese health profession institution showed a similar overall DREEM score of 130. Similar to this study, a positive perception of the learning environment was seen in several other studies conducted in nursing colleges of Malaysia [ 3 ], Saudi Arabia [ 4 ] and Iran [ 9 ]. A similar finding was seen in studies conducted by Arab [ 16 ], Imanipour [ 17 ], Bakshi [ 18 ] and Victor [ 19 ], which showed overall DREEM scores of 103.54, 104.39, 114.3 and 119, respectively. A study conducted among medical and nursing undergraduate students of Sri Lanka also showed positive perception [ 20 ]. Similar studies were predominantly conducted in medical schools of India [ 1 , 11 ], Pakistan [ 21 ], Malaysia [ 22 ], Saudi Arabia [ 23 ], Iran [ 24 ], Egypt [ 25 ], Australia [ 15 ], Brazil [ 26 ] and Sweden [ 14 ], and the findings from those studies are in agreement with those of the current study.

In contrast to the present finding, a study conducted in Egypt [ 7 ] among nursing students showed a poor perception of the learning environment. A contrasting result was also seen in a study conducted by Al-Ayed [ 27 ] in a medical college in Riyadh, with an overall DREEM score of 89.9. Studies conducted by Aghamolaei [ 28 ] and Taheri [ 2 ] in medical schools of Iran also reported potential problems, with total DREEM scores of 99.6 and 98, respectively. However, the researcher found no studies in which the perception of the learning environment was excellent. Variation in the populations and settings studied might explain the variation in scores.

In the present study, the interpretation of each five subscales of DREEM revealed a perception which was directed more towards the positive side. Similar finding was seen in the study conducted by Said [ 3 ], Sayed [ 4 ], Mayya [ 6 ], Farajpour [ 9 ], Bakshi [ 24 ] and Sajid [ 29 ].

On individual item analysis, ten items scored three or more, indicating positive aspects. The students perceived that there were opportunities to develop their interpersonal skills, that the teachers were knowledgeable and good at providing feedback, and that the teaching helped to develop their competence and confidence. The students were confident about passing the exam, had good friends and were comfortable socially. All these findings reveal the liberal atmosphere present in the institution.

According to Mayya and Roff [ 6 ], items scoring more than 3.5 are areas of excellence. In the present study, however, none of the items scored more than 3.5, indicating that there were no particularly excellent areas and that many areas needed improvement. This finding is consistent with the findings of the study conducted by Abussad [ 30 ].

Six items scored two or less. These low scores are the areas of concern. The students identified tiredness and stress that kept them from enjoying the course, the lack of a good stress support system, an emphasis on factual learning, and authoritarian teachers as significant problems. The impression that the teachers are authoritarian and emphasize factual learning has also been reported by other studies conducted in India [ 1 ] and Iran [ 5 ]. A perceived lack of a good support system was also seen in studies conducted in Iran by Aghamolaei [ 28 ] and Imanipour [ 17 ]. For a positive academic environment, the overall well-being of the students needs to be taken into consideration in terms of workload, maintaining a balance between academic activities and recreation time [ 1 ] so that students can enjoy the course. The findings from the present study indicate that the support system provided by the faculty and the institution should be improved in order to facilitate students' learning. As most of the students are not locals and must stay away from their parents and guardians, they should be made aware of the available support systems, for example by emphasizing the role of the preceptors who are accessible in the institution to help them. The findings also point to the need to strengthen the student-centered approach followed in the nursing program.

A significant difference was found between the perception of the learning environment and the year of enrollment. This finding is in accordance with the findings reported by Roff [ 12 ], Said [ 3 ], Brown [ 15 ], Bakshi [ 18 ] and Bakshi [ 24 ]. However, no significant difference was found between the perception of the learning environment and the other selected socio-demographic variables.

Of all the students, first-year students had the highest mean perception score, followed by third-year, second-year and fourth-year students, respectively. This trend was fairly consistent across the different subscales, with first-year students scoring higher than students in the other years. This might be explained by the fact that first-year students have not yet been exposed to all areas and are not yet stressed by the study. The positive perception among the newcomers might also have been fueled by the excitement of gaining admission to one of the reputed institutions of the nation. A study conducted by Said [3] in Malaysia among nursing students also revealed the highest DREEM score among first-year students. Similar findings were seen in studies conducted by Bakshi [18] and Al-Ayed [27] among nursing students in Iran and Saudi Arabia. However, a study conducted by Roff et al. [12] in Nepalese health-profession institutions showed improved perceptions in years 2 and 3 over year 1, as reflected by the DREEM scores across the years. A study conducted by Bakshi [24] in a medical college in Iran also revealed higher scores for the second and fourth years.
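The year-wise comparison described above amounts to computing the mean total DREEM score per year of enrollment and testing the difference with a one-way ANOVA. The sketch below illustrates this with fabricated per-year scores (not the study data), using SciPy's f_oneway.

```python
# Illustrative sketch of a year-wise comparison: mean total DREEM scores by
# year of enrollment followed by a one-way ANOVA. Scores are fabricated for
# illustration and do not reproduce the study data.
from scipy import stats

scores_by_year = {
    "Year 1": [132, 128, 140, 125, 137],
    "Year 2": [118, 122, 115, 127, 120],
    "Year 3": [124, 130, 121, 126, 119],
    "Year 4": [112, 119, 116, 110, 121],
}

# Mean total DREEM score per year (out of a maximum of 200)
for year, scores in scores_by_year.items():
    print(f"{year}: mean = {sum(scores) / len(scores):.1f}")

# One-way ANOVA across the four year groups
f_stat, p_value = stats.f_oneway(*scores_by_year.values())
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```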

This study provides valuable insight into the educational environment as perceived by nursing students. It was conducted in a single institution with a limited number of participants, so similar studies in other nursing institutions across the country would be important for establishing the generalizability of the findings. Qualitative studies are recommended to gain an in-depth understanding of the findings, address specific problems and highlight the strengths of the particular learning environment.

In conclusion, students in this institution revealed a positive perception of the learning environment, which varied significantly according to the year of enrollment. However, improvements are required across all five domains to achieve a high-quality educational environment.

Availability of data and materials

The datasets generated and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

ANOVA: One Way Analysis of Variance

BPKIHS: B. P. Koirala Institute of Health Sciences

DREEM: Dundee Ready Educational Environment Measure

IRC: Institutional Review Committee

SD: Standard Deviation

SPSS: Statistical Package for Social Sciences

References

1. Tripathy S, Dudani S. Students' perception of the learning environment in a new medical college by means of the DREEM inventory. Int J Res Med Sci. 2013;1(4):385–91.

2. Taheri M. Students' perceptions of learning environment in Guilan University of Medical Sciences. JME. 2009;13(4):127–33.

3. Md SN, Rogayah J, Hafiazah A. A study of learning environments in the Kulliyyah (Faculty) of Nursing, International Islamic University Malaysia. Malays J Med Sci. 2009;16(4):15–24.

4. Sayed HY, El-Sayed NG. Students' perceptions of the educational environment of the nursing program in Faculty of Applied Medical Sciences at Umm Al Qura University, KSA. J Am Sci. 2012;8(4):69–75.

5. Bakshialiabad H, Bakhsh MH, Hassanshahi G. Students' perceptions of the academic learning environment in seven medical sciences courses based on DREEM. Adv Med Educ Pract. 2015;6:195–203.

6. Mayya S, Roff S. Students' perceptions of educational environment: a comparison of academic achievers and under-achievers at Kasturba Medical College, India. Educ Health. 2004;17(3):280–91.

7. Sharkawy SA, El-Houfey AA, Hassan AK. Students' perceptions of educational environment in the faculties of nursing at Assiut, Sohag and South Valley universities. Ass Univ Bull Environ Res. 2013;16(2):167–97.

8. Miles S, Swift L, Leinster SJ. The Dundee Ready Education Environment Measure (DREEM): a review of its adoption and use. Med Teach. 2012;34(9):e620–34.

9. Farajpour A, Esaashari FF, Hejazi M, Meshkat M. Survey of midwifery students' perception of the educational environment based on the DREEM model at Islamic Azad University of Mashhad in the academic year 2014. Res Dev Med Educ. 2014;4(1):41–5.

10. Denz-Penhey H, Murdoch C. A comparison between findings from the DREEM questionnaire and that from qualitative reviews. Med Teach. 2009;31:449–53.

11. Kohli V, Dhaliwal U. Medical students' perception of the educational environment in a medical college in India: a cross-sectional study using the Dundee Ready Education Environment questionnaire. J Educ Eval Health Prof. 2013;10:5. https://doi.org/10.3352/jeehp.2013.10.5.

12. Roff S, McAleer S, Ifere OS, Bhattacharya S. A global diagnostic tool for measuring educational environment: comparing Nigeria and Nepal. Med Teach. 2001;23(4):378–82.

13. Roff S. The Dundee Ready Educational Environment Measure (DREEM) – a generic instrument for measuring perceptions of undergraduate health professions curricula. Med Teach. 2005;27(4):322–5.

14. Edgren G, Haffling AC, Jakobsson U, McAleer S, Danielsen N. Comparing the educational environment (as measured by DREEM) at two different stages of curriculum reform. Med Teach. 2010;32(6):e233–8. https://doi.org/10.3109/01421591003706282.

15. Brown T, Williams B, Lynch M. The Australian DREEM: evaluating student perceptions of academic learning environments within eight health science courses. Int J Med Educ. 2011;2:94–101.

16. Arab M, Rafiei H, Safarizadeh MH, Shojaei M, Safarizadeh MM. Nursing and midwifery students' perception of educational environment: a cross-sectional study in Iran. IOSR-JNHS. 2016;5(1):64–7.

17. Imanipour M, Sadoghiasi A, Ghiyasavandian S, Haghani H. Evaluating the educational environment of a nursing school by using the DREEM inventory. Glob J Health Sci. 2015;7(4). https://doi.org/10.5539/gjhs.v7n4p211.

18. Bakshi H, Abazari F, Bakhshialiabad M. Nursing students' perceptions of their educational environment based on DREEM model in an Iranian university. Malays J Med Sci. 2013;20(4):56–63.

19. Victor G, Ishtiaq M, Parveen S. Nursing students' perceptions of their educational environment in the bachelor's programs of the Shifa College of Nursing, Pakistan. J Educ Eval Health Prof. 2016;13:43.

20. Youhasan P, Sathananthan T. Educational environment for undergraduate medicine and nursing programme at Eastern University, Sri Lanka: students' perception. OUSL J. 2016;11:23–35.

21. Imran N, Khalid F, Haider II, Jawaid M, Irfan M, Mahmood A, et al. Students' perceptions of educational environment across multiple undergraduate medical institutions in Pakistan using DREEM inventory. J Pak Med Assoc. 2015;6(1):24–8.

22. Ugusman A, Othman NA, Razak ZNA, Soh MM, Faizul PNAK, Ibrahim SF. Assessment of learning environment among the first year Malaysian medical students. J Taibah Univ Med Sci. 2015;10(4):454–60. https://doi.org/10.1016/j.jtumed.2015.06.001.

23. Al Sheikh MH. Educational environment measurement, how is it affected by educational strategy in a Saudi medical school? A multivariate analysis. J Taibah Univ Med Sci. 2014;9(2):115–22. https://doi.org/10.1016/j.jtumed.2013.11.005.

24. Bakshi H, Bakshialiabad MH, Hassanshahi G. Students' perceptions of the educational environment in an Iranian medical school, as measured by the Dundee Ready Education Environment Measure. Bangladesh Med Res Counc Bull. 2014;40:36–41. https://doi.org/10.3329/bmrcb.v40i1.20335.

25. Youssef WT, Wazir YME, Ghaly MS, Khadragy R. Evaluation of the learning environment at the Faculty of Medicine, Suez Canal University: students' perceptions. Intel Prop Rights. 2013;1:102. https://doi.org/10.4172/ipr.1000102.

26. Enns SC, Perotta B, Paro HB, Gannam S, Peleias M, Mayer FB, et al. Medical students' perception of their educational environment and quality of life: is there a positive association? Acad Med. https://doi.org/10.1097/ACM.0000000000000952.

27. Al-Ayed IH, Sheik SA. Assessment of the educational environment at the College of Medicine of King Saud University, Riyadh. East Mediterr Health J. 2008;14(4):953–9.

28. Aghamolaei T, Fazel I. Medical students' perceptions of the educational environment at an Iranian Medical Sciences University. BMC Med Educ. 2010;10:87.

29. Sajid F, Rehman A, Fatima S. Perceptions of students of the learning environment studying an integrated medical curriculum. J Surg Pak. 2013;18(2):86–91.

30. Abusaad FES, Mohamed HES, El-Gilany AH. Nursing students' perceptions of the educational learning environment in pediatric and maternity courses using DREEM questionnaire. J Educ Pract. 2015;6(29):26–32.


Acknowledgements

I would like to thank Prof. Dr. Surya Raj Niraula for statistical advice and Ms. Ruby Ghimire for her support during pretesting. Similarly, I would like to extend sincere gratitude to Ms. Diksha Sapkota and Ms. Rita Pokharel for their support and encouragement during the study. I would also like to extend my humble thanks to all the participants without whom this project wouldn’t have been completed.

Funding

This research received no funding.

Author information

Authors and affiliations

Department of Medical Surgical Nursing, B.P. Koirala Institute of Health Sciences, Dharan, Nepal

Erina Shrestha, Ram Sharan Mehta & Gayanand Mandal

Department of Maternal Health Nursing, B.P. Koirala Institute of Health Sciences, Dharan, Nepal

Kriti Chaudhary

Department of Psychiatric Nursing, B.P. Koirala Institute of Health Sciences, Dharan, Nepal

Nirmala Pradhan


Contributions

ES developed the original idea of the study. ES and GM were involved in designing the protocol. RSM and KC provided feedback on the protocol. ES, KC and NP collected the data. Preliminary analyses were done by ES and then reviewed by RSM and GM. ES prepared the final draft of the manuscript. All the authors have reviewed and approved the final manuscript.

Corresponding author

Correspondence to Erina Shrestha.

Ethics declarations

Ethics approval and consent to participate

Ethical approval was obtained from Institutional Review Committee (IRC), B.P. Koirala Institute of Health Sciences.

The IRC reference number is 391/073/074-IRC.

Written informed consent was obtained from the participants prior to the study.

Consent for publication

Not Applicable

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article

Shrestha, E., Mehta, R.S., Mandal, G. et al. Perception of the learning environment among the students in a nursing college in Eastern Nepal. BMC Med Educ 19 , 382 (2019). https://doi.org/10.1186/s12909-019-1835-0


Received : 12 March 2019

Accepted : 03 October 2019

Published : 21 October 2019

DOI : https://doi.org/10.1186/s12909-019-1835-0


Keywords: Learning environment

BMC Medical Education

ISSN: 1472-6920


What really matters for successful research environments? A realist synthesis

Rola Ajjawi

1 Centre for Research in Assessment and Digital Learning (CRADLE), Deakin University, Geelong, Victoria, Australia

Paul E S Crampton

2 Research Department of Medical Education, University College London, London, UK

3 Monash Centre for Scholarship in Health Education (MCSHE), Faculty of Medicine, Nursing and Health Sciences, Monash University, Clayton, Victoria, Australia

Charlotte E Rees

Associated data.

Table S2. MeSH terms and a selection of key terms utilised in the database searches.

Table S3. Inclusion and exclusion criteria with respect to topic, recentness and type of article.

Table S4. Refined inclusion and exclusion criteria to include contextual parameters.

Table S5. Studies by type: qualitative, quantitative and mixed‐methods.

Research environments, or cultures, are thought to be the most influential predictors of research productivity. Although several narrative and systematic reviews have begun to identify the characteristics of research‐favourable environments, these reviews have ignored the contextual complexities and multiplicity of environmental characteristics.

The current synthesis adopts a realist approach to explore what interventions work for whom and under what circumstances.

We conducted a realist synthesis of the international literature in medical education, education and medicine from 1992 to 2016, following five stages: (i) clarifying the scope; (ii) searching for evidence; (iii) assessing quality; (iv) extracting data, and (v) synthesising data.

We identified numerous interventions relating to research strategy, people, income, infrastructure and facilities (IIF), and collaboration. These interventions resulted in positive or negative outcomes depending on the context and mechanisms fired. We identified diverse contexts at the individual and institutional levels, but found that disciplinary contexts were less influential. There were a multiplicity of positive and negative mechanisms, along with three cross‐cutting mechanisms that regularly intersected: time; identity, and relationships. Outcomes varied widely and included both positive and negative outcomes across subjective (e.g. researcher identity) and objective (e.g. research quantity and quality) domains.

Conclusions

The interplay among mechanisms and contexts is central to understanding the outcomes of specific interventions, bringing novel insights to the literature. Researchers, research leaders and research organisations should prioritise the protection of time for research, enculturate researcher identities, and develop collaborative relationships to better foster successful research environments. Future research should further explore the interplay among time, identity and relationships.

Short abstract

This realist review shows when and why interventions related to research strategy; people; income, infrastructure and facilities; and collaboration result in positive or negative research environments. Findings indicate that protected time, researcher identities and collaborative relationships are important for fostering successful research environments.

Introduction

Research environments matter. Environmental considerations such as robust cultures of research quality and support for researchers are thought to be the most influential predictors of research productivity. 1 , 2 Over 25 years ago, Bland and Ruffin 1 identified 12 characteristics of research‐favourable environments in the international academic medicine literature spanning the period from the mid‐1960s to 1990 (Box 1 ). Although these characteristics are aspirational in flavour, how they interplay to influence research productivity within increasingly complex institutional structures is not yet known. Indeed, although existing reviews have begun to help us better understand what makes for successful research environments, this research has typically ignored the contextual complexities and multiplicity of environmental characteristics 1 , 3 , 4 , 5 , 6 , 7 and has focused on narrow markers of productivity such as the quantity of research outputs (e.g. ref. 7 ) The current realist synthesis, therefore, aims to address this gap in the research literature by reviewing more recent literature ( 1992–2016 ) and exploring the features of successful research environments in terms of which interventions work, for whom, how and in what circumstances.

Characteristics of successful research environments 1

  • Clear organisational research goals
  • Research productivity as a priority and at least equal priority to other activities
  • A robust research culture with shared research values
  • A positive group climate
  • Participative governance structures
  • Non‐hierarchical and decentralised structures
  • Good communication and professionally meaningful relationships between team members
  • Decent resources such as people, funding, research facilities and time
  • Larger group size, moderately established teams and diversity
  • Rewards for research success
  • Recruitment and selection of talented researchers
  • Research‐oriented leaders with research expertise and skill

The contextual background for understanding successful research environments

Against a backdrop of the mass production of education, reduced government funding for research and ‘new managerialist’ cultures in higher education, 8 , 9 increased scrutiny of the quantity and quality of research, the research environments in which research is produced and the impacts of research has become inevitable. 10 Indeed, in higher education institutions (HEIs) globally, research productivity is being measured as part of individual researcher and research group key performance indicators. 7 In many countries, such as Australia, Hong Kong, New Zealand and the UK, 11 HEI research is measured on a national scale through government‐led research assessments. Such research measurement has contributed to the allocation of funding to universities and differentiation of universities in the competitive marketplace, with some solidifying their institutional identities as ‘research‐intensive’ and others emphasising their relative ‘newcomer‐to‐research’ status (e.g. previously ‘teaching‐intensive’ universities). 9 , 12 , 13 Such institutional differentiation also parallels that of individual academics within universities, who are increasingly encouraged to take either ‘research‐active’ or ‘education‐focused’ career pathways. 8 , 9 It is these broader national and institutional constraints that inevitably impact on research environments at the level of units, centres, departments and schools within universities (the level of ‘research environment’ that we focus on in this paper). Table S1 provides definitions of key terms.

Key features of research environments identified in previous reviews

Evans defines a research environment as including: ‘shared values, assumptions, beliefs, rituals and other forms of behaviour whose central focus is the acceptance and recognition of research practice and output as valued, worthwhile and pre‐eminent activity.’ 14 Previous reviews have tended to focus on interventions aimed at individual researchers, such as research capacity building, 4 , 5 , 7 and with individual‐level outcomes, such as increased numbers of grants or publications. 4 , 5 , 7 These reviews have typically concluded that research capacity‐building interventions lead to positive research outcomes. 4 , 5 , 7 Furthermore, the reviews have identified both individual and institutional enablers to research. Individual enablers included researchers’ intrinsic motivation to conduct research. 6 , 7 Institutional enablers included peer support, encouragement and review, 7 mentoring and collaboration, 4 , 5 research leadership, 5 , 6 institutional structures, processes and systems supporting research, such as clear strategy, 5 , 6 protected time and financial support. 5 Although these reviews have begun to shed light on the features of successful research environments, they have significant limitations: (i) they either include studies of low to moderate quality 4 , 5 or fail to check the quality of studies included, 7 and (ii) they do not explore what works for whom and under what circumstances, but instead focus on what works and ignore the influence of the context in which interventions are implemented and ‘how’ outcomes come about. Indeed, Mazmanian et al. 4 concluded in their review: ‘…little is known about what works best and in what situations.’

Conceptual framework: a realist approach

Given the gaps in the research literature and the importance of promoting successful research environments for individuals’ careers, institutional prestige and the knowledge base of the community, we thought a realist synthesis would be most likely to elucidate how multiple complex interventions can influence success. Realism assumes the existence of an external reality (a real world), but one that is filtered (i.e. perceived, interpreted and responded to) through human senses, volitions, language and culture. 15 A realist approach enables the development and testing of theory for why interventions may or may not work, for whom and under what circumstances. 16 It does this through recognising that interventions do not directly cause outcomes; instead, participants’ reactions and responses to the opportunities provided by the intervention trigger outcomes. This approach can allow researchers to identify causal links in complex situations, such as those between interventions and the contexts in which they work, how they work (mechanisms) and their outcomes. 17 Although the context–mechanism–outcome (CMO) approach is not necessarily linear, it can help to provide explanations that privilege contextual variability. 18

Aligned with the goals of realist research, this synthesis aims to address the following research question: What are the features of successful research environments, for whom, how and in what circumstances?

We followed five stages of realist synthesis: (i) clarifying scope; (ii) searching for evidence; (iii) assessing quality; (iv) extracting data, and (v) synthesising data. 19 Our methods also follow the RAMESES (realist and meta-narrative evidence synthesis: evolving standards) reporting guidelines. 20

Clarifying the scope

We first clarified the scope of our realist synthesis by identifying relevant interventions based on the Research Excellence Framework (REF) 2014 environment assessment criteria. The REF is a national exercise assessing the quality of research produced by UK HEIs, its impact beyond academia, and the environment that supports research. The assessment criteria indicated in the REF2014 environment template included the unit's research strategy , its people (including staffing strategy, staff development and research students), its income, infrastructure and facilities (IIF), as well as features of collaboration . 21 These guided our search terms (see stage 2 below). We chose to use these quality markers as they informed the UK national assessment exercise, upon which other national exercises are often based. In addition, these criteria were explicit, considered and implementable, and were developed through consensus. Like other realist syntheses, 18 , 22 , 23 ours considered a multiplicity of different interventions rather than just one and some of the papers we reviewed combined multiple interventions.

Based on previous reviews, 1, 4, 5, 7 our initial programme theory speculated that interventions aligned to having an explicit research strategy, staff development opportunities, funding and establishing research networks would be effective for creating successful research environments (Fig. 1 gives further details of our initial programme theory).

Figure 1. Initial programme theory

Searching for empirical evidence

We devised search terms as a team and refined these iteratively with the help of a health librarian experienced in searching. We split the research question into three key concepts: (i) research environment; (ii) discipline, and (iii) research indicator (i.e. positive or negative). We then used variations of these terms to search the most relevant databases including MEDLINE, ProQuest, Scopus, CINAHL (Cumulative Index to Nursing and Allied Health Literature) and Web of Science. Table S2 illustrates the MeSH terms and provides a selection of key terms utilised in the database searches.

We were interested in comparing research cultures across the disciplines of medical education, education and medicine for two key reasons. Firstly, the discipline of medical education consists of a rich tapestry of epistemological approaches including biomedical sciences, social sciences and education, and medicine. 24 , 25 Secondly, there have been disciplinary arguments in the literature about whether medical education should be constructed as medicine or social science. 24 , 26

We agreed various inclusion and exclusion criteria with respect to topic, recentness and type of article (Table S3 ), as well as refined criteria to include contextual parameters (Table S4 ). We chose 1992 as the start date for our search period as 1992 saw the first published literature review about productive research environments in the academic medicine literature. 1

Study selection

The first top-level search elicited 8527 journal articles across all databases. Once duplicate results had been removed, and 'topic' and 'recentness' study parameters reinforced, 420 articles remained. The searching and selection process is summarised in a PRISMA (preferred reporting items for systematic reviews and meta-analyses) diagram (Fig. 2). Three research assistants and one of the authors (PESC) initially assessed relevance by reviewing abstracts using preliminary inclusion criteria. If any ambiguities were found by any of the reviewers, abstracts were checked by one of the other two researchers (RA and CER). Where divergent views existed, researchers discussed the reasons why and agreed on whether to include or exclude. A 10% sample of these 420 abstracts was double-checked by an additional two researchers, including a number of articles previously excluded, for quality control purposes.

Figure 2. PRISMA flow diagram of the selection process

Assessment of quality

We assessed the journal articles for relevance and rigour. 20 We defined an article's relevance according to ‘whether it can contribute to theory building and/or testing’. 20 Following the relevance check and ‘type’ exclusions to original research papers, 100 articles remained, which were then assessed for rigour. Although we chose to narrow down to original research, we kept relevant articles such as systematic reviews and opinion pieces to inform the introduction and discussion sections of this paper.

We defined rigour as determining ‘whether the method used to generate the particular piece of data is credible and trustworthy’. 20 We used two pre‐validated tools to assess study quality: the Medical Education Research Study Quality Instrument (MERSQI) to assess the quality of quantitative research, 27 , 28 and the Critical Appraisal Skills Programme (CASP) qualitative checklist for qualitative and mixed‐method studies. 29 Both tools are used to consider the rigour of study design, sampling, type of data, data analysis and outcomes/findings, and have been employed in previous reviews. 23 , 30

Following the quality assessment, 47 articles remained and were then subjected to data extraction and synthesis. Five papers were excluded as they did not contribute to our theory building or lacked CMO configurations (CMOCs). We kept notes of the reasons for excluding studies and resolved doubts through discussion (Fig. 2).

Data extraction

Two data‐rich articles containing multiple CMOCs were inductively and deductively (based on the initial programme theory) coded by all of us to ensure consistency. We then discussed any similarities and differences in our coding. As is inherent in the challenges of realist approaches, we found differences in our identifications of CMOCs, which often related to how one particular component (e.g. time) could be an outcome at one moment and a mechanism the next. This alerted us to overlapping constructs, which we then explored as we coded remaining papers. To collect data across all remaining papers, we extracted information relating to: study design, methods and sample size; study setting; intervention focus; contexts of the intervention; mechanisms generated in the results, and outcomes. The key CMOCs in all 42 articles were identified primarily from the results sections of the papers. The process of data extraction and analysis was iterative with repeated discussion among the researchers of the demi‐regularities (i.e. patterns of CMOCs) in relation to the initial programme theory and negotiations of any differences of opinion.
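For readers unfamiliar with this style of structured extraction, the following minimal Python sketch shows one way the fields listed above (design, setting, intervention, contexts, mechanisms, outcomes) could be captured as records. The dataclass and the example entry are hypothetical illustrations, not the authors' actual extraction instrument.

```python
# Minimal sketch of a structured extraction record mirroring the fields listed
# above. The record type and the example entry are hypothetical illustrations,
# not the authors' actual extraction tool.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CMOCRecord:
    study_id: str
    design: str                 # e.g. qualitative interviews, survey, case study
    setting: str                # e.g. discipline and country
    intervention: str           # e.g. mentoring, protected time, research strategy
    contexts: List[str] = field(default_factory=list)
    mechanisms: List[str] = field(default_factory=list)
    outcomes: List[str] = field(default_factory=list)

example = CMOCRecord(
    study_id="Example-2015",
    design="qualitative interviews",
    setting="education, Australia",
    intervention="mentoring scheme for early-career researchers",
    contexts=["newcomer-to-research university"],
    mechanisms=["researcher identity", "supportive relationships"],
    outcomes=["increased research engagement"],
)
print(example)
```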

Data synthesis

Finally, we interrogated our data extraction to look for patterns across our data/papers. We used an interpretative approach to consider how our data compared with our initial programme theory in order to develop our modified programme theory.

Characteristics of the studies

The 42 papers represented the following disciplines: medical education ( n = 4, 10%); 31 , 32 , 33 , 34 education ( n = 18, 43%), 35 , 36 , 37 , 38 , 39 , 40 , 41 , 42 , 43 , 44 , 45 , 46 , 47 , 48 , 49 , 50 , 51 , 52 and medicine ( n = 20, 48%). 53 , 54 , 55 , 56 , 57 , 58 , 59 , 60 , 61 , 62 , 63 , 64 , 65 , 66 , 67 , 68 , 69 , 70 , 71 , 72 There were 26 (62%) qualitative studies, 11 (26%) quantitative studies and five (12%) mixed‐methods studies (Table S5 ). The studies were from countries across the globe, including Australia ( n = 10, 24%), the USA ( n = 7, 17%), the UK ( n = 6, 14%), Canada ( n = 4, 10%), South Africa ( n = 4, 10%), Denmark ( n = 2, 5%), Turkey ( n = 2, 5%) and others ( n = 7, 17%) (e.g. Belgium, China, Germany, New Zealand and the Philippines). The research designs varied but common approaches included qualitative interviews, surveys, documentary/bibliographic analysis, case studies and mixed‐methods studies. Study participants included academics, teachers, health care professionals, senior directors, PhD students, early‐career researchers (ECRs) and senior researchers. Table S6 lists the individual contexts, interventions, mechanisms and outcomes identified from individual papers.

Extending our initial programme theory

A key finding from our realist synthesis was that the same interventions fired either positive or negative mechanisms leading to positive or negative outcomes, respectively, depending on context. Surprisingly, the CMOCs were mostly consistent across the three disciplines (i.e. medical education, education and medicine) with local contexts seemingly interplaying more strongly with outcomes. Therefore, we present these disciplinary contexts here as merged, but we highlight any differences by disciplinary context where relevant.

Having a research strategy promoted a successful research environment when it enabled appropriate resources (including time) and valuing of research; however, it had negative consequences when it too narrowly focused on outputs, incentives and rewards. In terms of people, individual researchers needed to be internally motivated and to have a sense of belonging, and protected time and access to capacity-building activities in order to produce research. Lack of knowledge, researcher identity, networks and time, plus limited leadership support, acted as mechanisms leading to negative research outcomes. The presence of IIF was overwhelmingly indicated as necessary for successful research environments and their absence was typically detrimental. Interestingly, a few papers reported that external funding could have negative consequences because short-term contracts, reduced job security and the use of temporary junior staff can lead to weak research environments. 40, 67, 71 Finally, collaboration was crucial for successful research mediated through trusting respectful relationships, supportive leadership and belongingness. Poor communication and competitive cultures, however, worked to undermine collaboration, leading to isolation and low self-esteem, plus decreased research engagement and productivity. Table 1 highlights illustrative CMOCs for each intervention extending our initial programme theory.

Table 1. Positive and negative context–mechanism–outcome configurations (CMOCs) for each intervention. CMOCs indicated in bold highlight the three cross-cutting themes of time, identity and relationships. ECRs = early-career researchers.

Key cross‐cutting mechanisms: time, identity and relationships

As Table 1 shows, the same intervention can lead to positive or negative outcomes depending on the particular contexts and mechanisms triggered. This highlights greater complexity than is evident at first glance. Cross-cutting these four interventions were three mechanisms that were regularly identified as critical to the success (or not) of a research environment: time; researcher identities, and relationships. We now present key findings for each of these cross-cutting mechanisms and discuss how their inter-relations lead to our modified programme theory (Fig. 3). Note that although we have tried to separate these three mechanisms for ease of reading, they were often messily entangled. Table 2 presents quotes illustrating the way in which each mechanism mediates outcomes within particular circumstances.

Figure 3. Modified programme theory. ECR = early-career researcher

Table 2. Time, identity and relationships as cross-cutting mechanisms mediating successful research environments

Time

Time was identified as an important mechanism for mobilising research outcomes across our three disciplines. Time was conceptualised in several ways, including as: protected time; workload pressures influencing time available; efficient use of time; flexible use of time; making time, and time in career. The two most commonly considered aspects were protected time and workload implications. Protected time was largely talked about in the negative across a variety of contexts and disciplines, with lack of protected time leading to lack of researcher engagement or inactivity and reduced research productivity. 32, 35, 37, 41, 44, 47, 49, 61, 62, 63, 67 Also across a variety of contexts and disciplines, and acting as a positive mechanism, available protected time was found to lead to increased research productivity and active research engagement. 31, 36, 40, 48, 49, 63, 65 With regard to workload, limitations on the time available for research imposed by excessive other workloads led to reduced research activity, lower research productivity, poor-quality research and reduced opportunity to attend research training. 40, 41, 47, 49, 60, 67 Juggling of multiple responsibilities, such as clinical, teaching, administrative and leadership roles, also inhibited research productivity by diminishing the time available for research. 35, 40, 49 The alignment of research with other non-research work was described as driving efficiencies in the use of time, leading to greater research productivity (Table 2, quote 1).

Identity

Identity was also an important mechanism for mobilising research outcomes across our three disciplines. Interpretations included personal identities (e.g. gender), professional identity (e.g. as a primary practitioner or a primary researcher), and social identity (e.g. sense of belongingness). Researcher identity was often referred to in relation to first-career practitioners (and therefore second-career researchers). Sharp et al. 48 defined these as participants recruited into higher education not directly from doctoral study but on the basis of their extensive 'first-order' knowledge and pedagogical expertise. These were also practitioners conducting research in schools or hospitals. Identities were also referenced in relation to early, mid-career or senior researchers. Academic staff working in academic institutions needed to develop a sense of researcher identity, belongingness, self-efficacy for research and autonomy to increase their satisfaction, competence and research activity. 39, 40, 44, 46, 51, 67 For first-career practitioners (i.e. teachers, doctors), the research needed to be highly relevant and aligned to their primary identity work in order to motivate them. 53, 59, 62, 65 This alignment was described as having a strong research–teaching nexus. 40, 48 Linked to this concept was the need for first-career practitioners to see the impact of research in relation to their primary work (e.g. patient- or student-oriented) to facilitate motivation and to develop a researcher identity (Table 2, quote 2). 36, 37, 41, 49, 53, 54, 67 Where research was seen as irrelevant to primary identity work (e.g. English language teaching, general practice), there was research disengagement. 37, 48, 52, 59, 67

Relationships

For all researchers and across our three disciplines, relationships were important in mediating successful research environments. 31, 34, 38, 39, 41, 44, 57, 60, 66, 67 Positive research relationships were characterised by mutual trust and respect, 40, 41, 42, 43, 54, 66, 72 whereas others described them as friendships that take time to develop. 51 Mutually supportive relationships seemed to be particularly relevant to ECRs in terms of developing confidence, self-esteem and research capacity and making identity transitions. 35, 43, 48, 58, 67 Relationships in the form of networks were considered to improve the quality of research through multicentre research and improved collaboration. 33, 60 Supportive leadership, as a particular form of relationship, was an important mechanism in promoting a successful research environment. Supportive leaders needed to monitor workloads, set the vision, raise awareness of the value of research, and provide positive role-modelling, thereby leading to increased productivity, promoting researcher identities and creating thriving research environments (Table 2, quote 3). 31, 34, 37, 38, 40, 41, 43, 44, 46, 48, 49, 53, 55, 62 Research leadership, however, could be influenced negatively by the context of compliance and counting in current university cultures, damaging relationships, creating a loss of motivation, and leaving researchers feeling devalued. Indeed, the failure of leaders to recognise researcher identities had a negative effect on research productivity. 36, 37, 38, 43, 46, 48, 49

Intersections between time, identity and relationships within successful research environments

Time and identity

Time and identity intersected in interesting ways. Firstly, time was a necessary enabler for the development of a researcher identity. 37, 38, 41, 48, 49, 54, 59, 61, 63, 65, 67, 69 Secondly, those who identified as researchers (thus holding primary researcher identities) used their time efficiently to favour research activity outcomes despite a lack of protected time. 35, 43 Conversely, for other professors who lacked personal determination and resilience for research, having protected time did not lead to better research activity. 43 This highlights the fact that time alone is insufficient to support a successful research environment, and that it is how time is utilised and prioritised by researchers that really matters (Table 2, quote 4).

Identity and relationships

Interventions aimed at developing researcher identity consistently focused on relationship building across the three disciplines. The interventions that supported identity transitions into research included formal research training, 44, 48, 52, 68 mentoring, 41, 48, 57, 65, 72 writing groups, 72 and collaboration with peers and other researchers, 39, 41, 43 operating through multiple mechanisms including relationships. The mechanisms included self-esteem/confidence, increased networks, external recognition as a researcher, belongingness, and self-efficacy. 35, 41, 43, 44, 45, 52, 57 Furthermore, our data suggest that leadership can be an enabler to the development of a researcher identity. In particular, leadership enabled research autonomy, recognition and empowerment, and fostered supportive mentoring environments, leading to researcher identity development and research productivity (Table 2, quote 5). 34, 38, 46, 48

Time and relationships

Relationships were developed and sustained over time (Table 2, quote 6). Across the three disciplines, the role of leaders (managers, directors, deans) was to acknowledge and raise awareness of research, and then to prioritise time for research against competing demands, leading to effective research networks, cohesion and collaboration. 31, 34, 38, 43, 46, 48, 49, 50, 53, 55, 70 Second-career PhD students who did not invest time in establishing relationships with researchers in their new disciplines (as they already had strong supportive networks in their original disciplines) found that they had limited research networks following graduation. 48

Summary of key findings

Our initial programme theory was based on previous literature reviews 1 , 4 , 5 , 6 , 7 and on the REF2014 criteria. 10 , 21 However, we were able to develop a modified programme theory on the basis of our realist synthesis, which highlights novel findings in terms of what really matters for successful research environments. Firstly, we found that key interventions led to both positive (subjective and objective) and negative (subjective and objective) outcomes in various contexts. Interestingly, we did not identify any outcomes relating to research impact despite impact nowadays being considered a prominent marker of research success, alongside quantitative metrics such as number of publications, grant income and h‐indices. 21 Secondly, we found that disciplinary contexts appeared to be less influential than individual, local and institutional contexts. Finally, our modified programme theory demonstrates a complex interplay among three cross‐cutting mechanisms (time, researcher identity and relationships) as mechanisms underpinning both successful and unsuccessful research environments.

Key findings and comparisons with the existing literature

Our research supports the findings of earlier reviews 1, 5, 6, 7 regarding the importance of having a clear research strategy, an organisation that values research, research-oriented leadership, access to resources (such as people, funding, research facilities and time), and meaningful relationships. However, our research extends these findings considerably by showing that there is no simple linear relationship in which the presence of these interventions necessarily results in a successful research environment. For example, instituting a research strategy can have negative effects if the indicators are seen as overly narrow in focus or output-oriented. 38, 40, 46, 47, 64 Similarly, project money can lead to the employment of more part-time staff on fixed-term contracts, which results in instability, turnover and lack of research team expertise. 40, 67, 71

Our findings indicate that the interplays among time, identity and relationships are important considerations when implementing interventions promoting research environments. Although time was identified as an important mechanism affecting research outcomes within the majority of papers, researcher identity positively affected research outcomes even in time‐poor situations. Indeed, we found that identity acted as a mechanism for research productivity that could overcome limited time through individuals efficiently finding time to prioritise research through their motivation and resilience. 35 , 43 Time was therefore more than just time spent doing research, but also included investment in developing a researcher identity and relationships with other researchers over time. 37 , 38 , 41 , 48 , 49 , 54 , 59 , 61 , 63 , 67 , 69 Relationship‐building interventions were also found to be effective in supporting difficult identity transitions into research faced by ECRs and those with first‐career practitioner backgrounds. Supportive leadership, as a particular form of relationship, could be seen as an enabler to the provision of protected time and a reasonable workload, allowing time for research and for researcher identity formation. 34 , 38 , 46 , 48 Indeed, our realist synthesis findings highlight the central importance of researcher identity and thus offer a novel explanation for why research environments may not flourish even in the presence of a research strategy, resources (e.g. time) and valuing of research.

Researcher identity is complex and intersects with other identities such as those of practitioner, teacher, leader and so on. Brew et al. 39 , 73 , 74 explored researcher identification and productivity by asking researchers if they considered themselves to be ‘research‐active’ and part of a research team. Those who identified as researchers prioritised their work differently: those who were highly productive prioritised research, whereas those in the low‐productivity group prioritised teaching. 73 Interestingly, highly productive researchers tended to view research as a social phenomenon with publications, presentations and grants being ‘traded’ in academic networks. Brew et al. 39 explain that: ‘…the trading view relates to a self‐generating researcher identity. Researcher identity develops in the act of publication, networks, collaborations and peer review. These activities support a person's identification as a researcher. They also, in turn, influence performance measures and metrics.’ Although the relationships among identity, identification and productivity are clearly complex, we explored a broader range of metrics in our realist synthesis than just productivity.

Methodological strengths and limitations

This is the first study to explore this important topic using realist synthesis to better understand the influence of context and how particular interventions lead to outcomes. We followed RAMESES 20 guidelines and adopted a rigorous team‐based approach to each analytic stage, conducting regular quality checks. The search was not exhaustive as we could have ‘exploded’ the interventions and performed a comprehensive review of each in its own right (e.g. mentoring). However, for pragmatic reasons and to answer our broad research questions, we chose not to do this, as suggested by Wong et al. 20 Although all members of the team had been involved in realist syntheses previously, the process remained messy as we dealt with complex phenomena. The messiness often lies in untangling CMOCs and identifying recurrent patterns in the large amounts of literature reviewed.

Implications for education and research

Our findings suggest that interventions related to research strategy, people, IIF and collaboration are supported under the ‘right’ conditions. We need to focus on time, identity and relationships (including leadership) in order to better mobilise the interventions to promote successful research environments.

Individuals need to reflect on how and why they identify as researchers, including their conceptions of research and their working towards the development of a researcher identity such that research is internally motivated rather than just externally driven. Those who are second‐career researchers or those with significant teaching or practitioner roles could seek to align research with their practice while they establish wider research networks.

We recommend that research leaders support individuals to develop their researcher identity, be seen to value research, recognise that research takes time, and provide access to opportunities promoting research capacity building, strong relationships and collaboration. Leaders, for example, may introduce interventions that promote researcher identities and build research relationships (e.g. collaborations, networking, mentoring, research groups etc.), paying attention to the ways in which competitive or collaborative cultures are fostered. Browne et al. 75 recently recommended discussions around four categories for promoting identity transition: reflection on self (values, experiences and expectations); consideration of the situation (circumstances, concerns); support (what is available and what is needed), and strategies (personal strategies to cope with change and thrive). With the professionalisation of medical education, 76 research units are increasingly likely to contain a mixture of first‐ and second‐career researchers, and our review suggests that discussions about conceptions of research and researcher identity would be valuable.

Finally, organisations need to value research and provide access to resources and research capacity‐building activities. Within the managerialist cultures of HEIs, compliance and counting have already become dominant discourses in terms of promotion and success. Policymakers should therefore consider ways in which HEIs recognise, incentivise and reward research in all its forms (including subjective and objective measures of quantity, quality and impact) to determine the full effects of their policies on research environments.

Future research would benefit from further exploration of the interplay among time, identities and relationships (including leadership) in different contexts using realist evaluation. 77 Specifically, as part of realist approaches, longitudinal audio‐diaries 78 could be employed to explore researcher identity transitions over time, particularly for first‐career practitioners transitioning into second‐career researchers.

Contributors

RA and CER were responsible for the conception of the synthesis. All authors contributed to the protocol development. RA and PESC carried out the database searches. All authors sifted for relevance and rigour, analysed the papers and contributed to the writing of the article. All authors approved the final manuscript for publication.

Conflicts of interest

Ethical approval

Not required.

Supporting information

Table S1. Definitions of key terms.

Table S6. Contexts, interventions, mechanisms and outcomes identified in individual studies.

Acknowledgements

We thank Andy Jackson, Learning and Teaching Librarian, University of Dundee, Dundee, UK, for his advice and help in developing our literature searches. We also thank Laura McDonald, Paul McLean and Eilidh Dear, who were medical students at the University of Dundee, for their help with database searches and with sifting papers for relevance and rigour. We would also like to thank Chau Khuong, Australian Regenerative Medicine Institute, Monash University, Melbourne, Victoria, Australia, for her work in designing Figs 1 and 3.

Learning Environments Research Paper


Current perspectives on learning in classrooms make clear that students learn best when they are engaged in their learning and helped to develop rich conceptual understanding. These views on learning, often referred to as constructivist perspectives, propose that students actively and socially construct their knowledge. A challenge facing educators is how to create classrooms that support this learning.


Increasingly, educators have recognized the need to reconfigure classrooms as environments that encompass the complex individual and social processes necessary to promote understanding. For a learning environment to succeed, teachers need to move from their traditional role of delivering information to providing effective scaffolding that supports students in integrating and applying ideas. In this type of learning environment, students also have new roles. They need to be more invested in, and responsible for, their learning as they engage in authentic tasks, collaborate with classmates, and use technology for research and problem solving.

A number of K-12 programs have been developed that help teachers and students create ambitious learning environments. Examples include environments for elementary science and mathematics such as Rochel Gelman’s Preschool Pathways to Science, Douglas Clements’ Building Blocks mathematics environment for early elementary students, and Nancy Songer’s BioKIDS science environment for upper elementary classrooms. Secondary environments include Cognitive Tutors, a computer-based mathematics learning environment developed by researchers at Carnegie Mellon University; John Mergendoller and colleagues’ Problem-Based Economics; and Project-Based Science, developed by researchers at the University of Michigan. These environments for learning are carefully designed, theoretically framed, research-based programs that support all facets of the learning context. They represent ambitious pedagogy and strive for ambitious outcomes. Many include technology, such as computers or Web-based communication tools, either as a primary focus or as an important component. These environments for learning are changing the face of education.

In this research paper, we examine how learning environments are engineered to support ambitious teaching and learning. We begin by considering the role of learning theory in the design of learning environments. We then examine the methods used to create and study environments. Next, we introduce features of learning environments and describe selected learning environments according to these features to illustrate how similar features are instantiated differently across environments. We close by discussing challenges and future directions in the design of environments for learning.

Learning Theory and Learning Environments

In contrast to previous views that emphasized learning as a process of transferring information from teachers or texts to learners, new views emphasize that learners are active constructors of knowledge. Accordingly, learning occurs through a constructive process in which students modify and refine what they know as they explore and try to make sense of the world around them. Students possess prior knowledge that they use to interpret learning experiences and construct new knowledge.

There are various formulations of constructivism and they explain different aspects of learning. Under this broad constructivist umbrella are two major perspectives on human learning. The first is cognitive in nature and focuses on individual thinking and learning. The second is social in nature and focuses on social interaction and the role of interactions within social contexts. Both perspectives are central in informing the design of new environments.

Cognitive Perspective

The cognitive perspective emphasizes the role of the individual in learning and is concerned with how complex information is handled mentally by learners, including how learners remember information, relate new information to prior knowledge to build schemas or knowledge networks that organize ideas, and develop understanding. Research on cognition suggests that prior knowledge and its organization play a considerable role in learning and performance. For example, cognitive research provides insight into the skills and knowledge that underlie expert and novice performance. This research indicates that experts and novices differ in the amount and organization of knowledge and in their ability to apply knowledge to solve problems, comprehend text, and respond to situations. Simply put, experts and novices differ in their cognitive resources, especially strategies for learning and performing tasks. Experts from all disciplines draw from a richly structured information base and are more likely to recognize meaningful patterns of information when problem solving. Experts know their disciplines thoroughly and their understanding of subject matter allows them to see patterns, identify relevant information, and notice inconsistencies or discrepancies that are not apparent to novices.

Research on how people learn and acquire expertise in various disciplines has given rise to notions about how to help students learn specific subject area content. Most students are novices in the content areas of reading, writing, mathematics, science, and social studies. They think differently from adult experts and draw from an information base that is often comprised of informal ideas that they have acquired through their everyday experiences. When working on mathematics problems, for instance, students will often apply thinking and reasoning strategies that are qualitatively different from what mathematics educators expect. This is because students draw from a limited set of cognitive resources to make sense of school mathematics. They are unfamiliar with the practices of mathematicians and the strategies that expert mathematicians use to solve problems and generate knowledge.

Many new environments for learning are designed to help students develop competence in content areas. Informed by studies on how novices think and what misconceptions they have, these environments strive to move students toward sophisticated ways of understanding that are characteristic of experts. Some of these environments provide tutoring and guidance in the use of strategies and thinking processes typical of experts. An example of this type of environment is Jack Mostow and colleagues’ computer-based reading environment, Literacy Innovation that Speech Technology Enables (LISTEN). LISTEN uses intelligent instructional software that provides specific hierarchical tasks and assistance to help elementary school students develop reading competence. In LISTEN, reading is guided and supported in one-on-one interactions between the individual student and computer. The computer acts as an expert tutor. It displays stories and uses speech recognition software to listen to children read aloud. Students wear headsets with microphones attached as they read aloud stories matched to their estimated reading levels. LISTEN software assigns stories to students based on their individual performance, monitors students’ reading, and provides feedback and hints. The software is based upon a careful analysis of reading skills and modeled after expert reading teachers.
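
The core tutoring moves described above, matching a story to a student's estimated reading level and responding to misread words, can be pictured with a short sketch. The Python below is only an illustration of that idea under stated assumptions, not a description of LISTEN's actual software; the story titles, reading levels, and hint wording are hypothetical.

```python
# A minimal sketch of level-matched story assignment and word-level feedback,
# loosely modeled on the description of a reading tutor above.
# Story titles, levels, and hint text are hypothetical illustrations.

STORIES = [
    {"title": "The Lost Kitten", "level": 1, "text": "the kitten ran up the tall tree"},
    {"title": "A Trip to Mars", "level": 2, "text": "the rocket soared past the bright moon"},
]

def pick_story(estimated_level: int) -> dict:
    """Choose the story whose difficulty is closest to the student's estimated level."""
    return min(STORIES, key=lambda s: abs(s["level"] - estimated_level))

def give_feedback(story: dict, recognized_words: list[str]) -> list[str]:
    """Compare already-transcribed speech with the story text and return a hint
    for each word that appears to have been misread."""
    hints = []
    for expected, heard in zip(story["text"].split(), recognized_words):
        if expected != heard:
            hints.append(f"Try the word '{expected}' again.")
    return hints

story = pick_story(estimated_level=1)
print(story["title"])
print(give_feedback(story, "the kitten ran up the tall three".split()))
```

A real system would, of course, sit behind speech recognition and a far richer model of reading skill; the point here is only the shape of the assign-listen-respond loop.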

Cognitive theory informs how to help students accomplish tasks and engage in specific thinking processes. The cognitive perspective has its roots in the work of Swiss psychologist Jean Piaget who focused on the mental structures underlying knowledge; he studied how children advance from novice to expert ways of thinking by constructing increasingly sophisticated knowledge. The body of research on human cognition has provided critical insight into how students think and reason about how the world works, how experts acquire their expertise, and the cognitive demands of thinking and problem solving in a range of situations and domains.

Social Perspective

The social perspective is concerned with how learning is shaped by participation in activity and interactions within social contexts. This view draws from the work of Russian psychologist Lev Vygotsky, who argued that all learning originates in, and is a product of, the settings that learners navigate. This means that knowledge is contextualized, and learners build knowledge and deepen their understanding through observations and interactions with the physical world as well as discourse and participation in activities with others. In this sense, learning is regarded as inherently social, and the setting for learning (materials, activities, learners, and the social interactions among learners and teachers) shapes the knowledge that is produced.

Vygotsky’s ideas are prominent in the work of designers of learning environments. One of his most influential ideas is that meaningful learning occurs when learners are engaged in rich social activity in which they communicate,  collaborate, and form a learning community. In a classroom learning community, teachers and students engage collectively in learning that produces shared understandings. Such a community consists of people collaborating on problems or projects, relying on one another for assistance when needed, and sharing, discussing, and debating ideas. Based on just such a notion of community, Ann Brown and Joseph Campione developed Fostering Communities of Learners (FCL), an environment for grades one through eight that fosters learning by developing group knowledge about a topic. In FCL, students and teacher select a topic of interest and then break into research groups that focus on relevant aspects of the topic. Research groups pursue different but related questions and then explain their work to the other groups. Collectively, the groups then synthesize the information to form a comprehensive understanding of the topic. To showcase their new understandings, students produce group reports, poster displays, and presentations or demonstrations. A feature of FCL is the jigsaw group, a learning group that contains a member representing each initial research group. In this new group, each jigsaw member is responsible for teaching their research information to everyone else. Thus, each member represents a piece of the puzzle that provides important and different knowledge for understanding the topic.
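
The jigsaw structure is, at bottom, a simple regrouping rule: form each new group by taking one member from every research group. A minimal sketch of that rule follows, assuming the class has already formed equally sized research groups; the student names and topics are hypothetical.

```python
# A minimal sketch of regrouping research groups into jigsaw groups:
# each jigsaw group receives one member from every research group.
# Names and topics are hypothetical.

research_groups = {
    "habitats": ["Ana", "Ben", "Cam"],
    "food webs": ["Dev", "Eli", "Fay"],
    "adaptation": ["Gus", "Hana", "Ivy"],
}

# Take the i-th member of every research group to form the i-th jigsaw group.
group_size = min(len(members) for members in research_groups.values())
jigsaw_groups = [
    {topic: members[i] for topic, members in research_groups.items()}
    for i in range(group_size)
]

for i, group in enumerate(jigsaw_groups, start=1):
    roster = ", ".join(f"{name} ({topic})" for topic, name in group.items())
    print(f"Jigsaw group {i}: {roster}")
```

Each resulting group contains one "expert" per topic, which is exactly the property that lets every student contribute a distinct piece of the puzzle.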

An important idea that has emerged from the work on the role of community in learning is the notion of communities of practice. Introduced by Jean Lave and Etienne Wenger (1991), a community of practice describes a situation in which robust knowledge and understandings are socially constructed through talk, activity, and interaction in an authentic, real-world context. In classrooms, this means having students communicate and engage in activities that reflect the discipline under study. For instance, establishing a community of scientific practice requires that learners participate in activities similar to those of scientists. This entails engaging in scientific inquiry much like scientists do, but in ways that are appropriate and meaningful for students. Such an approach actively involves students in scientific practices such as conducting investigations, making observations, gathering and analyzing data, and reporting findings. A benefit of situating science learning in an authentic context is that it provides a meaningful and motivating backdrop for introducing students to the conventional language, practices, tools, and values of the scientific community.

Another influential idea from Vygotsky is the critical importance of supporting learners to accomplish tasks that they otherwise cannot accomplish on their own. Vygotsky coined the phrase zone of proximal development to represent the capacity that learners have to perform tasks with the help of others that they would not be able to perform on their own. Helping learners advance by moving through a zone of proximal development requires support. Theorists refer to this specialized support as scaffolding. Instructional scaffolding is the support provided to a learner or groups of learners by a more knowledgeable person, such as a teacher, to help advance learning. In recent years, the notion of scaffolding has been expanded to include learning technologies, such as computer software, that help learners participate in activities that are just beyond their abilities. Scaffolding can help structure a learning task, guide learners through a task, and support thinking, planning, and performance. In a classroom community of practice, scaffolding is essential for aiding learners in developing disciplinary skills and knowledge.

An example of an environment that incorporates extensive scaffolding in a community of practice is Guided Inquiry supporting Multiple Literacies (GIsML), developed by Anne Marie Palincsar and Shirley Magnusson. GIsML is a science environment for elementary classrooms that promotes student learning through guided inquiry. The term guided refers to the teacher’s role in scaffolding the development of students’ science knowledge and reasoning as they proceed through cycles of inquiry. In GIsML, students work in pairs or small groups to conduct hands-on investigations in which they collect and analyze data and then report findings to the class. The role of the teacher is to support student thinking through key issues of investigation, such as specific questions to drive the investigation and the design of methods. When it is time to report findings, the teacher assists and guides students in making claims and supporting those claims with evidence (Palincsar & Magnusson, 2001). This process engages students in using the tools, language, and ways of reasoning that are characteristic of scientific inquiry.

From the social perspective, learning depends on the experience and knowledge of students, the knowledge and skills of the teacher, the design of tasks, the tools that are available, and the community that is developed. Furthermore, these factors are interdependent; a change in one will influence the effect of others on learning. Environments developed along social perspective lines include tasks that are typical of those used in the disciplines, instructional scaffolding by more knowledgeable others, tool use that supports learning, and development of learning communities that engage students in practices representative of the subject under study.

Relevance of Theory for Creating Environments

Theory serves as a blueprint that helps designers envision the instructional landscape for building environments. By starting with theory, designers are able to anticipate how to support and organize learning. This anticipatory thinking is important for articulating the how and why of instruction. The how is what teachers and students will do during instruction or, more broadly, how the environment will work to promote learning. When designers start with a theoretical framework, they can envision the kinds of classroom activities and interactions necessary to motivate and sustain learning. The why provides explanatory power or insight into why designed instructional activities and interactions are productive for learning. Theory explains why particular instructional experiences, when structured and enacted in particular ways, are more or less likely to result in learning. For example, social constructivist theory provides direct guidance for how to support students as they engage in inquiry through techniques that scaffold thinking, planning, and performance in the inquiry process. Scaffolding is important because it guides learners through a task, reducing confusion and increasing the likelihood that students will attend to the important ideas.

Learning environments based on constructivist theories often incorporate components of both cognitive and social views. Increasingly, designers recognize that both aspects need to be addressed when developing learning environments. The Cognitive Tutors environment developed by John Anderson and colleagues at Carnegie Mellon University is a computer-based learning environment for secondary school mathematics classrooms. Cognitive Tutors is based on a cognitive theory of learning and performance that describes how individual learners acquire and learn to use mathematical knowledge. Cognitive Tutors software provides tutoring by presenting problem-solving situations to students individually, monitoring students’ solution steps, and providing feedback. The environment is centrally focused on individual cognition, but the designers also attend to the social context in which Cognitive Tutors is used. They consider how individual tutor use can be integrated with classroom instruction and collaborative problem-solving experiences. By joining cognitive and social perspectives, they are able to make Cognitive Tutors comprehensive and usable for teachers and students.

Designing and Studying Learning Environments

To engineer learning environments, designers need to have a deep understanding about the reality of classrooms and schools. They need expertise in how people learn and the conditions that give rise to learning. They also need to understand academic standards and disciplinary content, the types of tasks and instructional materials that can help students attain learning goals, the role of the teacher in orchestrating instruction, and how to assess learning. Many learning environments feature technology as a tool for learning. To effectively design for learning with technology, designers need to grasp the benefits and challenges of using technologies in classrooms. Finally, designers need to work well with others because the building of learning environments is often too complex for any one designer.

Designers typically work in interdisciplinary teams, with members drawn from education, psychology, computer science, and cognitive sciences. These professionals are centrally concerned with how people learn and how to configure environments to optimally support learning. Many are part of an emerging interdisciplinary field known as the learning sciences, which is comprised of researchers who study teaching and learning in a variety of settings, including school classrooms. The goal of the learning sciences is to better understand the cognitive and social processes that promote learning and to use this knowledge to create learning environments that help teachers teach more effectively and students learn more deeply. A hallmark of the learning sciences is collaboration among researchers with diverse professional backgrounds and ways of thinking to envision and design the schools and classrooms of the future.

Design teams often include disciplinary specialists from mathematics, the sciences, or the social sciences as well as teachers who can help designers think through how materials can be enacted and sustained in real-world classrooms. Because all aspects of an environment are designed, bringing together experts with a range of skills and perspectives is essential. A team working on the design of a middle school mathematics learning environment, for instance, might include an educational psychologist who understands how children think and reason mathematically and the kinds of instructional practices that can best support mathematics learning, a mathematician familiar with the mathematics content and standards, a literacy specialist who can guide the development of text materials for math learning, an expert on educational technology who can design computer tools to support learning, and a mathematics teacher who can provide insight into how teachers might best be supported in enacting the environment. Another key person is a mathematics education researcher who can study how the newly designed environment is enacted by teachers and students in classrooms and the effect of the environment on math learning.

The Design Cycle

The designing and building of learning environments is an iterative process that proceeds through cycles of design, implementation, and evaluation. The first step in the cycle, design, is the process of going from a set of ideas derived from theory to a usable, workable, instructional road map for enacting the environment. For example, designers of the learning environment Project-Based Science (PBS) draw from social constructivist theory to inform the design of the environment. The key social constructivist ideas include active construction, situated learning, and social interaction, among others. These theoretical ideas serve as design principles that provide insight into the kinds of classroom activities necessary to support learning. The first principle, active construction, emphasizes the importance of having students actively construct their knowledge by participating in real-world science activities. The second principle, situated learning, suggests that students need to work in an authentic context that mirrors practices in the scientific community. This means providing a context in which students can use and explore scientific practices and apply scientific ideas so they can more readily see the relevance of their participation in activities. Social interaction, the third principle, addresses the need for students to work with one another to conduct investigations and to discuss their ideas and findings. Collaboration helps students build shared understandings of scientific ideas and the nature of science as they engage in discourse with their classmates and teacher. These principles define clearly specified activities within the PBS environment. Design principles are pathways from theory to practice; they are starting points for envisioning how a learning environment might come together.

Early in the design cycle, designers face uncertainty because the initial design of an environment is really only a tentative plan. Instructional sequences and activities are sketched; instructional materials such as a teaching guide and student workbooks are drafted; and technology tools, if integral to the environment, are initially developed. At this point, the hard work is just beginning as designers then move their design work to the complex classroom setting.

In implementation, the second step, the newly designed environment is enacted in a small number of classrooms. Implementation is where teachers and students bring an environment to life, and oftentimes this is where tensions between the intended and enacted environment emerge. The intended environment is the ideal environment envisioned by designers; the enacted environment is what the environment actually looks like and how it works in the hands and minds of teachers and students in everyday classrooms. Implementation is a moment of truth for designers as they get a first glimpse into the practical realities of trying to create an effective environment for learning. Perhaps not surprisingly, issues often arise in early implementation efforts. This is where the third step, evaluation, comes into play.

Evaluation involves the careful study of the implementation. This is a critical step. By studying how an environment is enacted, designers get comprehensive accounts of what works in practice and what does not. This information is used for revising features that do not work as anticipated. For example, in early implementations of PBS, designers found that students were unaccustomed to working in groups and discussing ideas. Students were not used to learning from their own inquiries and had difficulty engaging in productive discussions with one another and the teacher. These issues required PBS designers to rethink how students could be more effectively supported in collaborative inquiry. PBS designers added teacher supports to the instructional materials that included a range of teaching strategies for fostering scientific inquiry and examples of questions and prompts to support discourse. Scaffolds from teachers, peers, and technology were incorporated to guide students through learning activities. Once these and other modifications were made, the PBS environment was implemented and studied again, using data collected from the implementation to inform the next round of design and implementation. After several cycles of design, implementation, and evaluation, the result was a research-based innovation that had been extensively field-tested for usability and effect on student learning and motivation.
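
Stripped to its skeleton, the cycle is a loop in which each evaluation feeds the next round of design. The sketch below is only a schematic of that loop built from hypothetical placeholder functions; it is not a description of any team's actual process or tooling.

```python
# A schematic of the design-implementation-evaluation cycle described above.
# The three functions are hypothetical placeholders for the real work.

def design(principles, revisions):
    """Turn theory-derived principles plus prior revisions into a workable plan."""
    return {"principles": principles, "revisions": list(revisions)}

def implement(plan):
    """Enact the plan in a few classrooms and collect observations."""
    return {"plan": plan, "observations": ["students unaccustomed to group discussion"]}

def evaluate(enactment):
    """Study the enactment and return the features that need rethinking."""
    return [f"add supports for: {issue}" for issue in enactment["observations"]]

principles = ["active construction", "situated learning", "social interaction"]
revisions = []
for cycle in range(3):  # several cycles of design, implementation, and evaluation
    plan = design(principles, revisions)
    enactment = implement(plan)
    revisions = evaluate(enactment)  # findings inform the next round of design
    print(f"cycle {cycle + 1}: planned revisions -> {revisions}")
```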

The approach to designing environments described here is often referred to as design-based research. Design-based research can be traced back to the work of Ann Brown, who was one of the first education researchers to promote the idea of designing, enacting, and studying innovations such as learning environments within everyday school settings. According to Brown, researchers can gain important insights about the conditions of learning by bringing theory into practical educational contexts (Brown, 1992). Design-based research enables designers to test whether their theoretical assumptions are usable in the real world. As designers follow a design-based research cycle, they generate knowledge that applies to classroom practice and leads to stronger connections between theory and practice. This work also contributes to a richer understanding of the guiding learning theories. Well-done design-based research, then, is likely to yield theoretical and practical insights necessary to advance knowledge by informing theory, design principles, or practice recommendations.

Scaling Learning Environments

Once learning environments have been field-tested in a small number of settings, the challenge is to test their usability in a wide variety of circumstances. To scale up means to take a learning environment that is successful in a few settings and expand the implementation beyond those classrooms and teachers who participated in the design-based research cycle. This involves modifying a learning environment for widespread use with many teachers and students throughout a district, or across districts in a state, or in multiple schools and districts around the country.

When designers prepare to go to scale, their focus shifts from considering an environment’s implementation within one or several classrooms to implementation in the larger context of schools and school systems. This kind of design work is substantially different and requires designers to attend to professional development for teachers, consider school and district resources for implementation, redesign technology to work within the existing infrastructure of schools, and modify materials and activities for effective use by a wide range of teachers and students. A central goal is to modify the environment so that it is usable and sustainable given school realities, yet provides opportunities and supports for teachers and students to enact innovative and ambitious instruction. To achieve this goal, designers supply teachers with highly specified and developed materials that are critical for ensuring success. Highly specified and developed materials provide teachers with a model of how to enact an environment as well as resources and strategies to promote learning. Educational researchers Deborah Ball and David Cohen coined the term specified and developed to emphasize that materials should specify a clear theoretical stance, learning goals, intended teaching practices, and guidelines for enactment (Ball & Cohen, 1996). Furthermore, materials should include resources for teachers and students to use, such as student workbooks and readers, assessments, teacher manuals, and professional development materials that provide examples of ways to enact an environment.

Scaling up also entails modifying environments so that they align with important learning goals found in district, state, and national standards. This is another essential way that designers make their environments usable and sustainable for schools and districts. School administrators and teachers are unlikely to adopt an innovative environment if it does not emphasize content and instructional practice recommendations made in state and national standards documents. Additionally, designers are increasingly finding that they need to show that their environments improve learning as measured by high-stakes tests. For this reason, many discipline-based learning environments emphasize recommendations made in state and national standards documents and include assessments that are aligned with important learning goals.

Features of Environments for Learning

Building a comprehensive learning environment requires that designers attend to the major features of the learning setting. Features are basic aspects of learning environments that influence student learning and performance. Major features include goals, tasks, instructional materials, social organization, teacher, technology, and assessment.

The goals of learning environments can be academic, social, metacognitive, or developmental. A learning environment might encompass all four types of goals or only one or two. Academic goals focus on disciplinary content and often include learning about disciplinary practices and norms. Goals can also be social, such as the interpersonal goal of learning to work cooperatively, the motivational goal of improving attitudes and promoting interest, and the communicative goal of learning the discourse modes of different disciplines. A third type of goal, metacognitive, promotes a disposition for thinking and reasoning. These are mental habits such as persistence and posing questions that support self-directed or self-regulated learning. Developmental goals tend to focus on moving students forward in terms of knowledge and expertise.

Tasks are specific activities that students perform to learn academic content and skills. Learning environments usually emphasize authentic tasks that reflect the work of experts and require students to use their knowledge and skills in real-world situations. Authenticity in a social studies learning environment, for instance, might mean engaging students in tasks that are similar to what historians do in ways that are appropriate and meaningful for students. This might entail students researching an historical topic of interest and then presenting information in a historically correct manner, using terms and making arguments and explanations as historians would.

Instructional Materials

Learning environments often provide materials to support and guide enactment. Teachers have a unique position in enactment because their use of materials shapes the potential of the environment for enhancing student learning. Increasingly, designers recognize teachers’ important role and develop materials that are designed to be educative for teachers. The materials provide targeted assistance to teachers to support their learning so that they, in turn, can better support student learning. Environments that provide educative materials might include a comprehensive guide for enactment with rationales for activities; background on content; guidance in how to use materials and technology with students; examples of questions and prompts to support discourse; and teaching strategies for fostering inquiry, scaffolding student learning, or for building communities of practice. When materials are not extensively developed and teachers play a central role in enacting the learning environment, teachers need to have considerable expertise to meet the goals set by designers.

Social Organization

Many learning environments require that students actively engage in learning, communicate their ideas, and learn from one another. These environments promote a social context that allows students to feel comfortable asking questions, seeking help, and responding to questions. Students collaborate and communicate around authentic tasks and investigations, and they participate in a community of practice that mirrors the discipline under study. Some learning environments are linked to communities that extend beyond the classroom. They might support community participation by encouraging students to present findings to audiences in the community, such as local interest groups and students in other classes. Students' community reach might be further extended to other schools and communities by publishing on the Internet. Technology-based learning environments, for instance, often connect students with other students across school sites to collect, share, and interpret data.

Teacher

The success of any learning environment depends on the teacher, even though the role of the teacher can vary considerably across environments. In some learning environments, teachers play a central role. Their instructional efforts to scaffold student learning, orchestrate group problem solving and investigation, facilitate discussions, and assess understanding are critical. In others, the teacher's position in enacting the learning environment is less central to helping students meet goals. This is the case for computer-based tutoring environments where the technology provides a high level of guidance and is the major influence on student learning. A challenge for teachers in enacting any type of environment is ensuring that the way the environment is enacted matches the theoretical stance of the environment. This requires that teachers have a firm understanding of the learning theory and goals underlying an environment.

Technology

Learning environments use technologies for many purposes. Students in many innovative learning environments are actively involved in using technology tools, such as Internet search engines for research, e-mail or instant messaging for communicating with peers and experts, and visualization or simulation software to create and study models. Some designers place technology at the core of their learning environments and provide custom-designed software tools to support students' knowledge building and knowledge integration. Some of these environments are designed to foster collaborative inquiry. Other computer-based environments are designed for individual work. Still other learning environments may use technologies although they are not central to the enactment of the environment. Increasingly, teachers need technology expertise because they are primarily responsible for troubleshooting and using technology tools to leverage learning.

Assessment

Most learning environments include or recommend several types of assessments to evaluate students' learning. Individual and group portfolios, student reports and presentations, and traditional tests are characteristic of many environments. In some, students design and build artifacts that showcase their learning. Assessments in discipline-based learning environments typically target content knowledge, reasoning skills, and students' understanding of the nature of the discipline. Some environments might also assess students' motivation, including interest, feelings of efficacy, and goals for learning.

Although many learning environments share most, if not all, of the features described above, they differ in the emphasis they place on particular features and how they instantiate them. For instance, technology is the core in some environments; in others it is secondary or not present at all. Another difference is that content might be addressed through problem-based tasks representative of a particular discipline in some environments, and project-based tasks may be used in others. Learning environments also encompass different types of social organization and differ in their overarching goals. Features of any particular environment are presumed to work together to foster learning in an educational setting. This is because the goals and instantiations of each feature of a learning environment derive from the same underlying theoretical ideas about learning.

Examples of Environments for Learning

This section presents summaries of two learning environments designed to support ambitious teaching and learning. Each environment has a strong theoretical foundation: one environment is grounded in the social perspective, the other in the cognitive perspective. Each offers a different way of bringing theory into design and practice. Brief descriptions are included of goals, the types of tasks and instructional materials used to reach the goals, social organization of the environments, the role of the teacher, the use of technology, and how learning is assessed.

Project-Based Science

In a Project-Based Science (PBS) learning environment, middle school students engage in real-world investigations in ways that are similar to how scientists conduct inquiry. Developed by Joseph Krajcik, Ronald Marx, Phyllis Blumenfeld, Elliot Soloway, and others at the University of Michigan, PBS promotes instruction and learning through carefully designed and developed inquiry projects. Projects are framed around a driving question that guides instruction and serves to organize students’ investigations. Driving questions are crafted to encompass science content and to connect with students’ interests and curiosities about the world. For instance, students learn about microbiology and infectious diseases by engaging in inquiry tasks framed around the question, How can good friends make me sick? In this project, students explore how a communicable disease spreads through a community. A central goal of the PBS environment is to have students engage in extended inquiry to understand science content and practices that are outlined in state and national standards. Another important goal is to contribute to students’ attitudes toward science.

The theoretical foundation of PBS draws from a social constructivist perspective that emphasizes active, situated, and collaborative learning. In PBS classrooms, students are provided with a meaningful context in which to explore the driving question over a sustained period of time. For example, the driving question, Why do I need to wear a helmet when I ride my bike? situates the science topic of force and motion in an issue that is likely to be of interest to students. As students become involved in the project, they collaborate with peers and with their teacher to ask and explore smaller questions that contribute to understanding the driving question. They conduct investigations, weigh evidence, write explanations, and discuss and present findings. As they pursue answers to the driving question, they participate in situated activities that help them learn scientific content and practices relevant and necessary to construct a meaningful response.

Students in PBS classrooms produce artifacts, or products, that showcase their learning. For example, students create models that represent scientific phenomena, develop concept maps that illustrate their understanding of complex ideas, and prepare presentations and reports that explain their findings and the evidence for those findings. Teachers often use students’ artifacts for assessment purposes in combination with traditional tests. Additionally, surveys are used to explore students’ perceptions of the learning environment and its influence on attitudes toward science and motivation to learn science.

PBS uses technology tools and resources such as Web-based databases, model-building software, handheld technologies, and the Internet for interactive inquiry. Computers and other technologies extend students’ thinking by providing access to information and opportunities to communicate, explore phenomena, and build scientifically accurate models that represent phenomena.

Teachers play a central role in PBS classrooms by orchestrating instruction so that students develop the important skills and stance necessary for engaging in inquiry. They provide instructional scaffolds that help students engage in productive discussions with one another and their teacher, plan and carry out investigations, analyze data, and present findings. Highly specified and developed teacher materials that help create and sustain a PBS environment include detailed lesson descriptions and supports that clearly identify learning goals; examples of students’ likely ideas; questions and tasks for guiding and monitoring student understanding; instructional strategies to support students as they engage in inquiry; and key ideas that teachers can emphasize in helping students make sense of their inquiry experiences.

Cognitive Tutors

Technology is the centerpiece of the Cognitive Tutors learning environment developed by John Anderson, Albert Corbett, Kenneth Koedinger, and others at Carnegie Mellon University. Cognitive Tutors is a computer-based environment for high school mathematics classrooms that uses intelligent instructional software to teach students such topics as algebra, geometry, and integrated math. Cognitive Tutors software provides one-to-one tutoring by presenting problem-solving situations to students individually and monitoring and guiding students as they work through the tasks. The central goal of the environment is to raise mathematics achievement by developing students’ math problem-solving abilities and deepening their math knowledge.

Cognitive Tutors is based on a cognitive theory of learning and performance that proposes students learn best by doing rather than watching or listening. The software presents real-world problem-solving situations that require students to apply math knowledge and practice specific math skills. For instance, the algebra tutor emphasizes algebraic reasoning through such problems as comparing car rental options, engineering a highway, and organizing to make and sell T-shirts. As students engage in the problem-solving tasks, they also become adept at using and interpreting mathematical representations such as tables, graphs, and symbolic expressions.

Cognitive Tutors software monitors students' problem-solving performance by following students as they work through a task and providing feedback. Each Cognitive Tutor employs a cognitive model that represents the skills and strategies required to complete each task. When a student makes an error, the tutor initially displays an error message and provides on-demand hints; multiple hints are available for each step of a problem to ensure that the correct path to a solution is followed. An error message serves as a prompt that allows a student to correct errors without assistance. Multiple hints allow the student to succeed with minimum assistance. The tutor provides tailored practice on math skills until students reach mastery performance levels. Once a student reaches mastery on a particular math skill, the tutor stops presenting new problems for that skill.
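
One common way for tutoring systems of this kind to decide when a skill has been mastered is to maintain a running estimate of the probability that the student knows the skill, updated after every problem-solving step, an approach known as Bayesian knowledge tracing. The sketch below illustrates such an update; the parameter values and the mastery threshold are illustrative assumptions, not the values used in Cognitive Tutors.

```python
# A minimal sketch of mastery tracking for a single skill, in the spirit of
# Bayesian knowledge tracing; all parameter values are illustrative assumptions.

P_INIT = 0.2     # assumed prior probability that the skill is already known
P_TRANSIT = 0.1  # assumed probability of learning the skill on any practice step
P_GUESS = 0.25   # assumed probability of a correct answer without knowing the skill
P_SLIP = 0.1     # assumed probability of an error despite knowing the skill
MASTERY = 0.95   # assumed mastery threshold

def update_mastery(p_known: float, correct: bool) -> float:
    """Update the estimated probability that the student knows the skill
    after observing one problem-solving step."""
    if correct:
        evidence = p_known * (1 - P_SLIP)
        posterior = evidence / (evidence + (1 - p_known) * P_GUESS)
    else:
        evidence = p_known * P_SLIP
        posterior = evidence / (evidence + (1 - p_known) * (1 - P_GUESS))
    # Allow for the chance that the student learned the skill during this step.
    return posterior + (1 - posterior) * P_TRANSIT

p = P_INIT
for step_correct in [False, True, True, True, True]:
    p = update_mastery(p, step_correct)
    print(f"estimated mastery: {p:.2f}")
    if p >= MASTERY:
        print("mastery reached; the tutor would stop assigning problems for this skill")
        break
```

Once the estimate crosses the threshold, the tutor can retire the skill and shift practice toward skills the student has not yet mastered, which is the behavior the paragraph above describes.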

The Cognitive Tutors environment integrates individual tutor use with classroom instruction that includes collaborative problem-solving activities and class discussions. The teacher facilitates classroom instruction and circulates and assists students as they interact with the tutor. The teacher needs to have subject matter knowledge, be familiar with tutor software (including how to troubleshoot technical issues), and be comfortable facilitating collaborative problem solving.

Instructional materials that help teachers enact the environment accompany the software. The materials include a problem-based textbook for students and a teacher’s guide that consists of assignments, assessments, teaching suggestions, and classroom management techniques. The purpose of the textbook and classroom activities is to extend the development of concepts emphasized in the software.

Assessment is an integral part of the Cognitive Tutors environment. Cognitive Tutors software includes step-by-step assessments of students’ mathematical skills and provides a skill report to identify math skill levels and progress for each student. Assessments provided in the teacher’s materials include exams, quizzes, and rubrics for grading class presentations. Teachers are also encouraged to create and share their own assessments in an online teacher community.

Challenges and Future Directions

Designing an environment for learning requires significant time, effort, and resources. A design-based research team will typically work for three or more years to develop, modify, and refine a learning environment. Designers work closely with teachers in classrooms to observe how an enactment unfolds and to study how the environment enhances learning. The iterative approach enables a design team to modify their environment in a real-world setting, carefully observe as teachers introduce the refinements, and then make further adjustments as needed. In fact, designers may follow teachers' classes over several years to gain insight into how learning can best be supported.

This approach requires long-term partnerships between educational researchers, school administrators, and teachers. It requires that schools change their culture and routines to support innovative practice and that designers find creative ways to help teachers reconfigure their classrooms as environments for ambitious learning. It is clear that contemporary learning environments represent a considerable departure from the type of classroom experience with which most teachers, students, administrators, and parents are familiar. Learning environments, then, pose special challenges that require considerable knowledge, skill, and foresight to address.

Future Directions

The present work on learning environments marks efforts to design the schools of the future. These first generation environments provide a glimpse of the potential of this work for transforming classrooms. Students in these environments have been introduced to new ways of learning. They solve meaningful problems, collaborate with others, use cutting-edge technology, and create artifacts that showcase their learning. They gain important knowledge and skills, appreciation for disciplinary practices, and new dispositions for learning. For designers, these environments provide a rich context for interdisciplinary research. As this work continues, we will gain a better understanding of how instructional interactions shape learning and how effective environments can be designed.

A necessary next step is to design environments for a wider range of school contexts. This is important for scaling up. Schools are becoming more linguistically and culturally diverse every year, and carefully designed environments that support students from different backgrounds are essential. Similarly, designers need to examine how learning environments can be created or adapted to include learners with special needs. There is evidence that learning environments can help address diversity because they offer a variety of instructional techniques, activities, technology tools, and different ways for students to participate.

New knowledge about how people learn has enriched understanding of how to create successful conditions for learning. Learning environments represent an expanded view of teaching and learning that encompasses the social context and recognizes the complexity of instruction. Over the next decade, new learning environments will emerge that may prove critical for preparing students for the 21st century. This research paper examined the building of learning environments in schools and classrooms. It is important to note that many others are being designed for informal learning settings such as museums, science centers, and afterschool programs.

Learning environments represent ambitious pedagogy and strive for ambitious outcomes. The designers of these environments are committed to transforming schools and classrooms into dynamic places where teachers teach more effectively and students learn more deeply. Design-based research is a new approach that strengthens the bridge between learning theory and educational practice and advances our understanding of both. Design work is challenging and complex, requiring collaborations among educational researchers, teachers, disciplinary experts, school leaders, and others. The work is vital for improving schools to meet the needs of our rapidly changing knowledge-based and technological society. The schools of the future are on the horizon, and the interdisciplinary efforts of designers promise to create innovative environments that will serve as a foundation for the next generation of schools.

Bibliography:

  • Ball, D. L., & Cohen, D. K. (1996). Reform by the book: What is—or might be—the role of curriculum materials in teacher learning and instructional reform? Educational Researcher, 25(9), 6-8.
  • Blumenfeld, P., Marx, R. W., & Harris, C. J. (2006). Learning environments. In W. Damon, R. M. Lerner, K. A. Renninger, & I. E. Sigel (Eds.), Handbook of child psychology (6th ed., Vol. 4): Child psychology in practice (pp. 297-342). Hoboken, NJ: Wiley.
  • Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences, 2, 141-178.
  • Brown, A. L., & Campione, J. C. (1994). Guided discovery in a community of learners. In K. McGilly (Ed.), Classroom lessons: Integrating cognitive theory and classroom practice (pp. 229-270). Cambridge, MA: MIT Press.
  • Corbett, A. T., Koedinger, K. R., & Hadley, W. S. (2001). Cognitive Tutors: From the research classroom to all classrooms. In P. Goodman (Ed.), Technology enhanced learning: Opportunities for change (pp. 235-263). Mahwah, NJ: Lawrence Erlbaum Associates.
  • Davis, E. A., & Krajcik, J. S. (2005). Designing educative curriculum materials to promote teacher learning. Educational Researcher, 34(3), 3-14.
  • Design-Based Research Collective. (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher, 32(1), 5-8.
  • Gelman, R., & Brenneman, K. (2004). Science learning pathways for young children. Early Childhood Research Quarterly, 19, 150-158.
  • Krajcik, J., & Blumenfeld, P. (2006). Project-based learning. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 317-334). New York: Cambridge University Press.
  • Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.
  • Maxwell, N., Bellisimo, Y., & Mergendoller, J. (2001). Problem-based learning: Modifying the medical school model for teaching high school economics. Social Studies, 92, 73-78.
  • Palincsar, A. S., & Magnusson, S. J. (2001). The interplay of first-hand and second-hand investigations to model and support the development of scientific knowledge and reasoning. In S. Carver & D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 151-193). Mahwah, NJ: Lawrence Erlbaum Associates.
  • Sarama, J., & Clements, D. H. (2004). Building blocks for early childhood mathematics. Early Childhood Research Quarterly, 19, 181-189.
  • Sawyer, R. K. (Ed.). (2006). Cambridge handbook of the learning sciences. New York: Cambridge University Press.
  • Singer, J., Marx, R. W., Krajcik, J., & Chambers, J. C. (2000). Constructing extended inquiry projects: Curriculum materials for science education reform. Educational Psychologist, 35(3), 165-178.
  • Songer, N. B. (2006). BioKIDS: An animated conversation on the development of curricular activity structures for inquiry science. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 355-369). New York: Cambridge University Press.
  • Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

ORDER HIGH QUALITY CUSTOM PAPER

learning environment research papers

  • MyU : For Students, Faculty, and Staff

Fall 2024 CSCI Special Topics Courses

Cloud computing.

Meeting Time: 09:45 AM‑11:00 AM TTh  Instructor: Ali Anwar Course Description: Cloud computing serves many large-scale applications ranging from search engines like Google to social networking websites like Facebook to online stores like Amazon. More recently, cloud computing has emerged as an essential technology to enable emerging fields such as Artificial Intelligence (AI), the Internet of Things (IoT), and Machine Learning. The exponential growth of data availability and demands for security and speed has made the cloud computing paradigm necessary for reliable, financially economical, and scalable computation. The dynamicity and flexibility of Cloud computing have opened up many new forms of deploying applications on infrastructure that cloud service providers offer, such as renting of computation resources and serverless computing.    This course will cover the fundamentals of cloud services management and cloud software development, including but not limited to design patterns, application programming interfaces, and underlying middleware technologies. More specifically, we will cover the topics of cloud computing service models, data centers resource management, task scheduling, resource virtualization, SLAs, cloud security, software defined networks and storage, cloud storage, and programming models. We will also discuss data center design and management strategies, which enable the economic and technological benefits of cloud computing. Lastly, we will study cloud storage concepts like data distribution, durability, consistency, and redundancy. Registration Prerequisites: CS upper div, CompE upper div., EE upper div., EE grad, ITI upper div., Univ. honors student, or dept. permission; no cr for grads in CSci. Complete the following Google form to request a permission number from the instructor ( https://forms.gle/6BvbUwEkBK41tPJ17 ).

CSCI 5980/8980 

Machine learning for healthcare: concepts and applications.

Meeting Time: 11:15 AM‑12:30 PM TTh  Instructor: Yogatheesan Varatharajah Course Description: Machine Learning is transforming healthcare. This course will introduce students to a range of healthcare problems that can be tackled using machine learning, different health data modalities, relevant machine learning paradigms, and the unique challenges presented by healthcare applications. Applications we will cover include risk stratification, disease progression modeling, precision medicine, diagnosis, prognosis, subtype discovery, and improving clinical workflows. We will also cover research topics such as explainability, causality, trust, robustness, and fairness.

Registration Prerequisites: CSCI 5521 or equivalent. Complete the following Google form to request a permission number from the instructor ( https://forms.gle/z8X9pVZfCWMpQQ6o6  ).

Visualization with AI

Meeting Time: 04:00 PM‑05:15 PM TTh  Instructor: Qianwen Wang Course Description: This course aims to investigate how visualization techniques and AI technologies work together to enhance understanding, insights, or outcomes.

This is a seminar style course consisting of lectures, paper presentation, and interactive discussion of the selected papers. Students will also work on a group project where they propose a research idea, survey related studies, and present initial results.

This course will cover the application of visualization to better understand AI models and data, and the use of AI to improve visualization processes. Readings for the course cover papers from the top venues of AI, Visualization, and HCI, topics including AI explainability, reliability, and Human-AI collaboration.    This course is designed for PhD students, Masters students, and advanced undergraduates who want to dig into research.

Registration Prerequisites: Complete the following Google form to request a permission number from the instructor ( https://forms.gle/YTF5EZFUbQRJhHBYA  ). Although the class is primarily intended for PhD students, motivated juniors/seniors and MS students who are interested in this topic are welcome to apply, ensuring they detail their qualifications for the course.

Visualizations for Intelligent AR Systems

Meeting Time: 04:00 PM‑05:15 PM MW  Instructor: Zhu-Tian Chen Course Description: This course aims to explore the role of Data Visualization as a pivotal interface for enhancing human-data and human-AI interactions within Augmented Reality (AR) systems, thereby transforming a broad spectrum of activities in both professional and daily contexts. Structured as a seminar, the course consists of two main components: the theoretical and conceptual foundations delivered through lectures, paper readings, and discussions; and the hands-on experience gained through small assignments and group projects. This class is designed to be highly interactive, and AR devices will be provided to facilitate hands-on learning.    Participants will have the opportunity to experience AR systems, develop cutting-edge AR interfaces, explore AI integration, and apply human-centric design principles. The course is designed to advance students' technical skills in AR and AI, as well as their understanding of how these technologies can be leveraged to enrich human experiences across various domains. Students will be encouraged to create innovative projects with the potential for submission to research conferences.

Registration Prerequisites: Complete the following Google form to request a permission number from the instructor ( https://forms.gle/Y81FGaJivoqMQYtq5 ). Students are expected to have a solid foundation in either data visualization, computer graphics, computer vision, or HCI. Having expertise in all would be perfect! However, a robust interest and eagerness to delve into these subjects can be equally valuable, even though it means you need to learn some basic concepts independently.

Sustainable Computing: A Systems View

Meeting Time: 09:45 AM‑11:00 AM  Instructor: Abhishek Chandra Course Description: In recent years, there has been a dramatic increase in the pervasiveness, scale, and distribution of computing infrastructure: ranging from cloud, HPC systems, and data centers to edge computing and pervasive computing in the form of micro-data centers, mobile phones, sensors, and IoT devices embedded in the environment around us. The growing amount of computing, storage, and networking demand leads to increased energy usage, carbon emissions, and natural resource consumption. To reduce their environmental impact, there is a growing need to make computing systems sustainable. In this course, we will examine sustainable computing from a systems perspective. We will examine a number of questions:   • How can we design and build sustainable computing systems?   • How can we manage resources efficiently?   • What system software and algorithms can reduce computational needs?    Topics of interest would include:   • Sustainable system design and architectures   • Sustainability-aware systems software and management   • Sustainability in large-scale distributed computing (clouds, data centers, HPC)   • Sustainability in dispersed computing (edge, mobile computing, sensors/IoT)

Registration Prerequisites: This course is targeted towards students with a strong interest in computer systems (Operating Systems, Distributed Systems, Networking, Databases, etc.). Background in Operating Systems (Equivalent of CSCI 5103) and basic understanding of Computer Networking (Equivalent of CSCI 4211) is required.


IMAGES

  1. Learning Environment Research Paper Example

  2. (PDF) Factors Affecting the Design and Development of a Personal

  3. (PDF) How to Write Ecology Research Papers

  4. (PDF) Learning Environment Perceptions and Student Background Variables

  5. (PDF) Towards Consensus on the School Library Learning Environment: A

  6. Environmental Issues For Research Papers

VIDEO

  1. Reinders, H. Research agenda: Language learning beyond the classroom

  2. Learning Environment: Reflections and Thoughts

  3. Education research & lifelong learning via industry partnerships: Case studies in Engineering

  4. Atari 2600 Pong Reinforcement Learning demo

  5. Creating an Effective Learning Environment: Part 2 (Video #3 in Neuroeducation 101 Series)

  6. Transforming the Educational Landscape in a Post-COVID Era

COMMENTS

  1. Home

    Overview. Learning Environments Research builds our understanding of pre-primary, primary, high school, college and university, and lifelong learning environments irrespective of subject area. Apart from classroom and school environments, the journal devotes special attention to out-of-school learning environments such as the home, science ...

  2. What are the key elements of a positive learning environment

    Introduction. The learning environment (LE) comprises the psychological, social, cultural, and physical setting in which learning occurs and in which experiences and expectations are co-created among its participants (Rusticus et al., 2020; Shochet et al., 2013). These individuals, who are primarily students, faculty and staff, engage in this environment and the learning process as they ...

  3. PDF Active learning classroom design and student engagement: An ...

    While student‑centered instruction can occur in any style of classroom, active learning classrooms (ALCs) are purposefully designed to promote student engagement in the learning process (Adedokun et al., 2017; Baepler et al., 2016; Freeman et al., 2014; Wiltbank et al., 2019).

  4. The home learning environment and its role in shaping children's

    Introduction. Over the past three decades, a growing number of studies have provided empirical evidence that the home learning environment (HLE) is an important predictor of differences in children's academic and social development (e.g., most recently, Rose, Lehrl, Ebert, & Weinert, 2018; Tamis-LeMonda, Luo, McFadden, Bandel, & Vallotton, 2019).

  5. The Evolution of the Field of Learning Environments Research

    My first introduction to the field of learning environments was in the early 1970s when I was undertaking research for my Ph.D. involving an evaluation of Australia's first national curriculum project, namely, the Australian Science Education Project []. This research was guided and inspired by Herbert Walberg's evaluation of Harvard Project Physics in the 1960s in the USA [2,3].

  6. Frontiers

    Previous research revealed the connection between students' behavioral and emotional engagement and a supportive classroom environment. One of the primary tools teachers have to create a supportive classroom environment is effective feedback. In this study, we assessed the supportive classroom environment using the perception shared by all students from the same classroom of teachers' use ...

  7. Shaping the future learning environments with smart elements

    This thematic issue entitled Future Learning Environment: Pedagogical and Technological perspectives aims to report the latest research findings and share good practices on creating brand new learning environments that emphasize learning effectiveness, efficiency, flexibility and engagement. It began inviting submissions in April 2020.

  8. Learning Environment Perceptions and Student Background Variables as

    Her research interests include motivation and persistence in foreign language study, teaching foreign language productive skills, learning environment, and curriculum development and instruction. She has several publications related to teacher education systems, authentic instruction, instructional design, teaching English as a foreign language ...

  9. (PDF) Learning Environment Research Review of Literature

    The Classroom Environment Scale (CES), like the LEI, is a learning environment instrument that measures "perceptions rather than intelligence, personality or interests" (Fraser, 1982). In ...

  10. Adaptive e-learning environment based on learning styles ...

    This research also attempts to outline and compare the proposed adaptive e-learning environment with a conventional e-learning approach. The paper is based on mixed research methods used to study the impact as follows: a development method is used in designing the adaptive e-learning environment, a quasi-experimental research design ...

  11. PDF Evaluation of the Effect of Learning Environment on Student's Academic

    Nigeria using a secondary research approach. This paper therefore examines the concept of ... learning environment and that of students taught in a dull learning environment. Adamu (2015) ... unqualified teachers and a less enabling environment. In another study by Duruji, Azuh and Oviasogie (2014), which examines the impact of learning ...

  12. THE EFFECT OF LEARNING ENVIRONMENT ON ACADEMIC ...

    Aims: This paper described the availability of students' learning environment in physical education under modular instruction in terms of exercise and dance in a public school in Negros Occidental ...

  13. PDF Effect of Classroom Learning Environment on Students' Academic ...

    learning environment are an important area of research since the 1960s around the world. At present, there are two main approaches being used for the assessment of the classroom learning environment. One approach is based on observation of classrooms or general kinds of activities by external observers.

  14. Outdoor learning in early childhood education: exploring benefits and

    An outdoor learning environment is perceived as either a purposefully constructed space or a natural, untouched setting that allows learners to engage in genuine and experiential learning encounters. ... As a result of our close analysis of the 20 selected papers, the noted benefits and challenges of implementing outdoor learning in early ...

  15. PDF Research Into Identifying Effective Learning Environments

    forms of learning and research or uses ICT to optimise capital planning or property management. The DesignShare awards have six categories of which only one focussed on the learning environment - 'enhance teaching and learning and accommodate the needs of all learners'. This incorporated a number of ... [Figure 1. DEST Australia]

  16. Perception of the learning environment among the students in a nursing

    Background: The learning environment is an important basis for students' learning processes and for their preferences regarding future workplaces. It is considered an essential factor in determining the success of an effective curriculum and students' academic achievement. This study attempts to assess the perception of the learning environment among nursing students. Methods: A descriptive cross ...

  17. What really matters for successful research environments? A realist

    Introduction. Research environments matter. Environmental considerations such as robust cultures of research quality and support for researchers are thought to be the most influential predictors of research productivity [1, 2]. Over 25 years ago, Bland and Ruffin [1] identified 12 characteristics of research-favourable environments in the international academic medicine literature spanning the ...

  18. PDF Influence of Learning Environment on Students' Academic ...

    1) To determine the extent to which the learning environment can affect performance in senior secondary school mathematics. 2) To determine the difference in learning environment and its influence(s) on students' academic achievement in mathematics. 3) To identify factors within the learning environment that affect students' academic ...

  19. Education Research Paper on Learning Environments

    This sample education research paper on Learning Environments features: 6900 words (approx. 23 pages) and a bibliography with 18 sources. Browse other research paper examples for more inspiration. If you need a thorough research paper written according to all the academic standards, you can always turn to our experienced writers for help.

  20. Optimizing a Bucket Filling Strategy for Wheel Loaders Inside a ...

    The world model is a fast surrogate simulator, and we use it to create a dream environment where an RL agent explores and optimizes its bucket filling behavior. We then deploy it on a full-size wheel loader without modifications, and demonstrate that it is capable of outperforming the previous benchmark controller, namely, a controller ...
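
    The snippet above describes optimizing a control policy inside a learned world model (a "dream environment") and then deploying it on the real wheel loader. As a purely illustrative aside, the short Python sketch below shows the general shape of that idea on a toy one-dimensional filling task; the surrogate dynamics, reward, policy form, and random-search "exploration" are all hypothetical stand-ins and are not taken from the paper.

    import random

    def world_model(fill, action):
        # Hypothetical surrogate dynamics: predicted bucket fill after one action.
        # A larger dig action fills faster but costs more energy.
        next_fill = min(1.0, fill + 0.15 * action)
        energy = 0.05 * action ** 2
        reward = (next_fill - fill) - energy  # fill gain minus energy cost
        return next_fill, reward

    def rollout(policy_gain, steps=20):
        # One episode inside the dream environment; returns the total reward.
        fill, total = 0.0, 0.0
        for _ in range(steps):
            action = policy_gain * (1.0 - fill)  # simple proportional policy
            fill, r = world_model(fill, action)
            total += r
        return total

    # "Exploration" in the dream environment: random search over the policy gain,
    # standing in for the RL agent described in the abstract.
    best_gain, best_return = None, float("-inf")
    for _ in range(1000):
        gain = random.uniform(0.0, 5.0)
        ret = rollout(gain)
        if ret > best_return:
            best_gain, best_return = gain, ret

    print(f"best gain found in dream environment: {best_gain:.2f} (return {best_return:.2f})")
    # In the paper's setting the optimized policy is then deployed on the real
    # machine without modification; here we only report the surrogate result.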

  21. Digital Twin Research on Masonry-Timber Architectural Heritage ...

    Due to various factors such as aging, erosion by the natural environment, and man-made destruction, architectural heritage has developed various defects and cracks. Pathological cracks are especially typical of masonry-timber architectural heritage and directly affect the structural stability of masonry-timber buildings. This paper uses artificial intelligence and architecture and ...

  22. Fall 2024 CSCI Special Topics Courses

    Visualization with AI. Meeting Time: 04:00 PM‑05:15 PM TTh. Instructor: Qianwen Wang. Course Description: This course aims to investigate how visualization techniques and AI technologies work together to enhance understanding, insights, or outcomes. This is a seminar style course consisting of lectures, paper presentation, and interactive ...