Research Article

Transforming students’ attitudes towards learning through the use of successful educational actions

Javier Díez-Palomar (1, *), Rocío García-Carrión (2), Linda Hargreaves (3), María Vieites (4)

1 Department of Linguistic and Literary Education, and Teaching and Learning of Experimental Sciences and Mathematics, University of Barcelona, Barcelona, Spain
2 Faculty of Psychology and Education, University of Deusto, Ikerbasque, Basque Foundation for Science, Bilbao, Spain
3 Faculty of Education, University of Cambridge, Cambridge, United Kingdom
4 CREA–Community of Research of Excellence for All, University of Barcelona, Barcelona, Spain

* Corresponding author. E-mail: [email protected]

Author contributions: JDP: data curation, formal analysis, methodology, validation, writing (original draft, review and editing). RGC: conceptualization, investigation, methodology, writing (review and editing). LH: conceptualization, formal analysis, investigation, methodology, supervision, validation, writing (review and editing). MV: investigation, project administration, writing (review and editing).

Published: October 12, 2020
https://doi.org/10.1371/journal.pone.0240292

Abstract

Previous research shows that there is a correlation between attitudes and academic achievement. In this article, we analyze for the first time the impact of interactive groups (IGs) and dialogic literary gatherings (DLGs) on the attitudes that students show towards learning. A quantitative approach was used, based on attitude tests validated in previous research. The data suggest that in both cases the participants show positive attitudes and that the social context has an important influence on students’ attitudes. The items with the highest correlations include group work, mutual support, and distributed cognition. In the case of IGs, group work is the most appreciated aspect, while in the case of DLGs, self-image and self-confidence are the two most clearly valued attitudes. The positive impact of IGs and DLGs on students’ attitudes may help teachers transform their practices and decision-making within the classroom.

Citation: Díez-Palomar J, García-Carrión R, Hargreaves L, Vieites M (2020) Transforming students’ attitudes towards learning through the use of successful educational actions. PLoS ONE 15(10): e0240292. https://doi.org/10.1371/journal.pone.0240292

Editor: Christian Stamov Roßnagel, Jacobs University Bremen, GERMANY

Received: April 9, 2020; Accepted: September 24, 2020; Published: October 12, 2020

Copyright: © 2020 Díez-Palomar et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: All relevant data are within the manuscript and its Supporting Information files.

Funding: JDP, RGC, LH and MVC acknowledge the funding provided by the EU Commission under grant no. 2015-1-ES01-KA201-016327, corresponding to the project Schools as Learning Communities in Europe: Successful Educational Actions for All (SEAS4ALL), under the Erasmus+ programme, and the Spanish Ramón y Cajal grant RYC-2016-20967 for open access publication of this article.

Competing interests: The authors have no competing interests.

Introduction

In this article, we address the following research question: What impact does participation in interactive groups (IGs) and dialogic literary gatherings (DLGs) have on the attitudes that students show towards learning? To define “attitudes”, we draw on the definition of Harmon-Jones, Harmon-Jones, Amodio and Gable [ 1 ], who characterize attitudes as “subjective evaluations that range from good to bad that are represented in memory” [ 1 ]. This definition is also consistent with the classic definitions about “attitudes” used in social psychology studies [ 2 ].

Previous research suggests that there is a clear relationship between students’ attitudes and their academic achievement [ 3 , 4 ]. Decades ago, classic studies in the field of educational research [ 5 ] concluded that teachers’ expectations about students’ attitudes and behaviors may explain students’ effective academic achievement. According to [ 4 ], the process of the “social construction of identity” explains why there are students who seem destined to obtain poor academic results. Drawing on the theoretical approach developed by Mead [ 6 ], Molina and her colleagues [ 4 ] argue that the way in which a student defines his/her own identity determines his/her own learning expectations and, as a consequence, his/her own academic career. In this sense, the process of constructing identity is social in essence.

According to Mead [ 6 ], the self emerges as a result of social interaction with other people who project their expectations and attitudes on the individual. The identity of a person is formed by two components. The first component is the me , which is of social origin and incorporates the attitudes of others about the individual; the second component is the I , which is the conscious reaction of each individual to those attitudes. A process is thus created in which identity is the result of the dialogue between the individual and others. This somehow explains why some students end up developing an identity as bad students, while others develop an identity as good students. This process has been called the “Pygmalion effect” by educational researchers [ 5 ]. As Flecha [ 7 ] suggests, drawing on successful educational actions (SEAs), teachers get their students to achieve better results, and that, in turn, explains why these students improve their self-concept as students (i.e., their me and I , in Mead’s terms). However, does that mean that they also change their attitudes towards learning in school?

Classic studies such as Learning to Labor [ 8 ] suggest that students with low academic performance tend to be children who reject school. These students tend to manifest that feeling of rejection in wayward attitudes. These children do not see school as a desirable or attractive place; instead, they show an attitude of rejection and resistance towards schooling, which is accompanied by low academic performance. Later researchers, such as Bruner, have suggested that this attitude is the result of the failure of schools to respond to the expectations of these children (and their families) [ 9 ]. Students’ identities are defined in other spaces and with other references. This may have a negative impact on the ability of teachers to teach. Various studies have suggested that as some of these children grow, their interest in school diminishes. This happens especially in the transition between primary and secondary school, as some students lose interest in science, mathematics and other subjects. Given this situation, researchers have found that learning initiatives located outside of the school can change these attitudes towards learning, as in the case of visiting museums, laboratories, or research centers [ 10 ].

This article discusses the impact of participating in two educational activities previously defined as successful educational actions (SEAs) [ 7 ] on attitudes towards learning shown by students who have participated in these actions. Thus far, we have clear evidence of the positive impact that SEAs have on learning outcomes [ 11 – 15 ], and there are studies suggesting that there is also a positive effect on the coexistence and cohesion of the group-class [ 16 , 17 ]. However, previous studies have not explored the impact that such SEAs may have on students’ own attitudes. Therefore, this article discusses this dimension of learning, which, as the studies mentioned above claim, is a relevant aspect to understanding how learning works.

Theoretical framework

Attitudes and learning.

There is an assumed belief in education that there is a direct relationship between student attitudes and academic achievement. Renaud [ 18 ] distinguishes between dispositions and attitudes by stating that the former are more “resistant” to change than the latter. Dispositions are defined as “more general and enduring characteristics”, while attitudes are tendencies or internal states of the person towards anything that a person can evaluate, such as “learning math, extracurricular activities, or the general notion of going to school” [ 18 ]. Previous research on attitudes and learning has found a clear relationship between the two. Renaud [ 18 ] cites literature reviews indicating that there is a correlation between attitude and achievement in mathematics [ 19 – 21 ] and in science [ 22 ]. According to previous research, the relationship becomes stronger at higher educational levels [ 22 – 24 ].

Ma and Kishor [ 19 ] analyze three attitude-related indicators to evaluate their impact on academic achievement: self-concept in mathematics, family support, and the gender role in mathematics. According to their data, the most important correlation corresponds to self-concept (p. 24). Masgoret and Gardner [ 23 ] found that motivation is more closely related to academic achievement than attitude; attitude seems to be related to achievement, but indirectly, through motivation. Motivation and self-concept are clearly correlated, but the research is not conclusive about the direction of the relationship, i.e., we do not know (yet) whether motivation gives rise to a positive self-concept or whether having a positive self-concept translates into increased motivation. In any case, both variables seem to correlate directly with academic achievement: higher motivation and self-concept are associated with better academic results (in general terms).

The “symbolic interactionism” approach

The theoretical approach that has devoted the most effort to analyzing the relationship between attitudes, motivation, self-concept and academic achievement is that of symbolic interactionism. George H. Mead [ 6 ] is one of the best-known representatives of this theory. According to his findings, self-concept is of social origin. “The self is something which has a development; it is not initially there at birth” [ 6 ]. To explain this process, Mead proposes the concept of the generalized other. According to him, the generalized other is “the organized community or social group which gives to the individual his or her unity of self” [ 6 ]. Mead illustrates how this concept works to create the self-concept by drawing on the metaphor of a game, using the example of baseball. The baseball team is what Mead calls the generalized other. Each player has a specific role within the team, and he or she acts in accordance with what is expected of him or her in that position. The rest of the team does the same, so that individual actions are defined and carried out within the more global unit that is the team (i.e., the generalized other). Using this example and others, Mead [ 6 ] was able to show that the self is the result of a social process. Similarly, Vygotsky [ 25 ] claimed that higher psychological functions emerge through interpersonal connections and actions with the social environment until they are internalized by the individual.

Mead states as follows:

It is in the form of the generalized other that the social process influences the behavior of the individuals involved in it and carrying it on, i.e., that the community exercises control over the conduct of its individual members; for it is in this form that the social process or community enters as a determining factor into the individual’s thinking. [ 6 ]

This social process involves interaction with other individuals in the group through shared activities. The classroom is the perfect example of a group. The teacher and the students are part of a social group with defined norms [ 26 ] as well as an institutional objective (teaching and learning), where each “player” performs a specific role according to those (declared or implicit) norms.

In some investigations, this social unit has been defined as a “community of practice” [ 27 , 28 ]. Brousseau [ 29 ] uses the concept of the “contrat didactique” (didactic contract) to characterize this social unit and analyze its functioning in the mathematics classroom. According to Brousseau, there is a relationship between the different actors (individuals) participating in the mathematics classroom, in which each plays a specific role and has specific responsibilities. The teacher has the obligation to create sufficient conditions for the appropriation of knowledge by the students and must be able to recognize when this happens. Similarly, the responsibility of the students is to satisfy the conditions created by the teacher. Brousseau studies how the teacher performs what he calls the didactic transposition of scientific knowledge to be taught in school. S/he has to identify the epistemological obstacles and the cognitive obstacles that make it difficult for students to learn in the classroom.

However, other studies have suggested that there are factors of another nature (neither cognitive nor epistemological) that also influence the academic achievements reached by students. This is the case with interactions [ 16 , 30 , 31 ]. As Mead [ 6 ] suggested, the self-concept created by an individual is the result of the internalization of the expectations that each individual has of himself or herself by the role s/he plays in the group to which s/he belongs. For example, the student who always tries hard and answers the teacher’s questions is fulfilling his or her role within the good student group. The group expects him or her to play that role. It is part of his/her identity. In addition, s/he acts accordingly. The effect of the positive or negative projection of expectations on students has been widely studied in education [ 32 , 33 ]. What we know is that teachers have to be cautious and try not to project negative expectations on students because that has a clear effect on their academic achievements, giving rise to well-studied interactions such as the Pygmalion effect, or the self-fulfilling prophecy [ 5 ].

However, the impact of successful educational actions [ 7 ] on the attitudes that students have towards learning at school in the context of interactive groups and dialogic literary gatherings has not been studied so far. This impact is what is discussed in this article.

The successful educational actions of interactive groups and dialogic literary gatherings

The research question discussed in this article is framed in the context of the implementation of two successful educational actions identified by the European Commission in the research project titled INCLUD-ED : Strategies for inclusion and social cohesion from education in Europe [ 34 ]. A successful educational action is defined as an action carried out in the school, the result of which significantly improves students’ learning [ 7 ]. The two successful educational actions that are discussed herein are interactive groups and dialogic literary gatherings.

Interactive groups.

Interactive groups (IGs) are a group-based teaching practice in which students are put together in small groups of approximately six or seven, with an adult facilitating the task. IGs must be heterogeneous in composition, including children of different ability levels, genders, socioeconomic backgrounds, etc. The adult (the facilitator) is a volunteer who encourages dialogic interaction among the group members while they perform the task designed by the teacher. The teacher splits the students among four or five IGs (depending on the number of children in the classroom and the time available for the lesson). Each group of students has an assigned task, which the teacher has previously designed. There is a total of four or five tasks (the same number as the number of IGs). The assignments address the subject of the lesson plan (i.e., mathematics, language, science, history, etc.). To perform the assigned task, groups have fifteen or twenty minutes (depending on the total time allocated for that activity in the school day). After this interval, the teacher asks the students to move to the next IG, where they will find another task. When the class is over, each child must have gone through all of the tasks, so all of the children perform four or five different tasks designed by the teacher to cover the curriculum requirements. In some schools, the children move from one IG to the next; in other cases, the teachers prefer to ask the facilitators to move between the groups to avoid the noise and disorder created by the children getting up and moving to the next table (task).

The facilitators never provide solutions to the tasks executed by the students. Instead, they encourage students to share, justify, and explain their work to their group mates. Their responsibility is to encourage students to use dialogic talk [ 15 ], which is based on the principles of dialogic learning [ 35 ]. Research evidence on IGs suggests that using dialogic talk increases participants’ chances of improving their academic achievement [ 15 , 30 , 36 ]. When students are asked by the facilitator to justify their answers to a task, they need to defend their claims conceptually; this implies that they must be able not only to understand the concept or concepts embedded in the assigned tasks but also to explain them to their group mates. The type of talk (speech) that appears when children engage in this type of interaction has been defined as dialogic talk [ 30 ] because it is oriented towards validity claims [ 37 ], not towards the power position that children occupy within the group.

Dialogic literary gatherings.

Dialogic literary gatherings (DLGs) are spaces in which students sit in a circle and share the reading of a classic literary book. The gathering is facilitated by the teacher, whose role is not to intervene or give his/her opinion but to organize the students’ participation by assigning them turns. Every child who wants to share his/her reading raises his/her hand and waits until the teacher gives him/her a turn. Readings come from classic literature, such as works by Shakespeare, Cervantes, Kafka, Tagore, etc. [ 38 , 39 ]. Students read the assigned pages at home (either a whole chapter or a certain number of pages, according to the teacher’s criteria). While reading the assignment, the student highlights a paragraph and writes down the reason for his/her choice. Then, during the DLG session in school, the children bring the paragraph or paragraphs they want to share with the rest of their classmates. At the beginning of the session, the teacher asks who wants to share his/her paragraph and writes down the names of the students offering to share on a list. The teacher then starts with the first name on the list; that student reads his/her paragraph aloud, indicates the page of the text so that the rest of the participants in the DLG can follow the reading, and explains the reason for his/her choice. After the reading, the teacher opens the floor for questions, always prioritizing the children who participate less often. When the teacher considers that the topic has been sufficiently discussed, either because the idea that prompted the intervention has been fully commented on or because the children’s questions have drifted to unrelated topics, s/he moves to the next name on the list. The process is repeated until the session ends.

When talking about their paragraph, children become involved in a process called “dialogic reading” [ 40 ], which is based on the application of Bakhtin’s concept of “dialogism” [ 41 ]. Bakhtin explains this concept using the idea of “polyphony” to refer to the use of multiple voices in a narrative work, as in the case of Dostoevsky’s poetics [ 41 ]. According to Bakhtin, no voice is the result of a single speaker; rather, it integrates different voices. This concept has been reused and reinterpreted in educational research: knowledge is the result of internalizing the voices of the multiple people (teachers, family members, friends, classmates, and others) whom we have encountered throughout our lives. DLGs recreate that multiplicity of voices through dialogues that generate a space in which all children contribute their opinions, ideas, and understandings about the paragraph being discussed. In this sense, reading comprehension develops in a much deeper way than if the child had to read the material individually, because s/he can incorporate the points of view of his/her peers into his/her own final comprehension.

Methodology

The data used to discuss the research question come from a research project titled SEAs4All–Schools as Learning Communities in Europe . The dataset has been submitted to this journal as supporting data for public use. Six schools from four European countries (Cyprus, the United Kingdom, Italy and Spain) participated in this project. Five of them were primary schools, and the sixth was a middle/high school. All of the schools were selected because they applied successful educational actions (SEAs) [ 7 ]. After implementing IGs and DLGs, a survey was conducted in three of the schools to evaluate the impact of using these two types of SEAs on students’ attitudes and perceptions towards learning. Children between 7 and 11 years old participated in the survey. Two of the surveyed schools are located in the United Kingdom, and the third is located in Italy. The schools are located in different contexts: one of the schools in the United Kingdom is in an area where families have a high economic status and high cultural capital (Cambridge), while the other English school is located in a neighborhood considered to be of a medium-level SES (Norwich). The Italian school is located in a low SES area of Naples. A total of 418 children participated in the survey (251 participating in DLGs and 167 engaging in IGs), as shown in Table 1 .

Table 1. https://doi.org/10.1371/journal.pone.0240292.t001

To collect the data, the SAM questionnaire, developed at the Universities of Leicester and Cambridge, UK, was used as a model for the evaluation of the impact of the implemented educational actions. The original SAM questionnaire consists of 17 items that are measured using a 5-point Likert scale, which ranges between “strongly agree” and “strongly disagree.” The questionnaire used in the current study was amended by drawing on previous results from a pilot test and was thus reduced to 12 items [ S1 File ].

The children answered a paper version of the questionnaire. The data were then coded and entered into an Excel matrix that was later analyzed in SPSS (version 25.0). To check the database and detect possible transcription errors, a univariate descriptive analysis was conducted using the frequency table for each item to verify that all codes and weights matched the data collected through the paper questionnaires. Whenever an anomaly was detected, we reviewed the original paper questionnaire to verify the information and data transcribed in the matrix.
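
The coding check described above can be sketched with a few lines of pandas (the actual analysis was run in SPSS version 25.0; the file name and item column names below are hypothetical):

    import pandas as pd

    df = pd.read_csv("sam_responses.csv")          # hypothetical export of the Excel matrix
    item_cols = [c for c in df.columns if c.startswith("item_")]
    valid_codes = {1, 2, 3, 4, 5}                  # 5-point Likert coding

    for col in item_cols:
        counts = df[col].value_counts(dropna=False).sort_index()
        print(col, counts.to_dict())
        unexpected = set(counts.index) - valid_codes
        if unexpected:                             # flag entries to check against the paper forms
            print(f"  -> review original questionnaires: unexpected codes {unexpected}")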

To analyze the data, we first produced a descriptive report, tabulating the data in frequency tables and computing the mean, median and mode, as well as the variance and standard deviation, for each item.
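
A short pandas sketch of those univariate descriptives (mean, median, mode, variance and standard deviation per item), continuing from the hypothetical DataFrame above:

    import pandas as pd

    summary = pd.DataFrame({
        "mean": df[item_cols].mean(),
        "median": df[item_cols].median(),
        "mode": df[item_cols].mode().iloc[0],      # first mode if several values tie
        "variance": df[item_cols].var(),
        "std": df[item_cols].std(),
    })
    print(summary.round(2))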

Table 2. https://doi.org/10.1371/journal.pone.0240292.t002

Before performing Bartlett’s test, four of the items were recoded (#1, #3, #4, and #6), since the direction of the Likert scale used in these four items was opposite to that used for the rest of the items. These four items were worded in a negative tone (i.e., “we learn best when the teacher tells us what to do”, “learning through discussion in class is confusing”, “sometimes, learning in school is boring”, and “I would rather think for myself than hear other people’s ideas”), unlike the rest of the items, in which the tone was positive. Therefore, for items #1, #3, #4, and #6, the labels “strongly agree”, “agree a little”, “not sure”, “disagree a little”, and “strongly disagree” referred to a scale with a negative direction, whereas for the rest of the items the same labels referred to a positive direction. For this reason, the responses to these four variables were recoded into four new variables that reversed the original direction of the response scale.
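
As a sketch, reversing a 5-point Likert item amounts to mapping each code x to 6 - x, so that “strongly agree” and “strongly disagree” swap places (column names here are hypothetical):

    import pandas as pd

    df = pd.read_csv("sam_responses.csv")                           # hypothetical coded dataset
    reversed_items = ["item_01", "item_03", "item_04", "item_06"]   # hypothetical column names

    for col in reversed_items:
        df[col + "_rec"] = 6 - df[col]                              # 1<->5, 2<->4, 3 unchanged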

In both cases (IGs and DLGs), Bartlett’s test suggests that we can reject the null hypothesis that the correlation matrix is an identity matrix, which means that we can use factor analysis to identify the principal components that explain the greatest percentage of the variance. The results of this analysis are discussed below.

Ethics statement

The studies involving human participants were reviewed and approved by the Ethics Committee of the Community of Research on Excellence for All, University of Barcelona. The schools collected the families’ informed consent for their children’s participation in this study.

The Ethics Board was composed of: Dr. Marta Soler (president), who has expertise in the evaluation of projects from the European Framework Program of Research of the European Union and of European projects in the area of ethics; Dr. Teresa Sordé, who has expertise in the evaluation of projects from the European Framework Program of Research and is a researcher in Roma studies; Dr. Patricia Melgar, a founding member of the Catalan Platform against gender violence and a researcher in the area of gender and gender violence; Dr. Sandra Racionero, a former secretary and member of the Ethics Board at Loyola University Andalusia (2016–2018); Dr. Cristina Pulido, an expert in data protection policies and child protection in research and communication and a researcher in communication studies; Dr. Oriol Rios, a founding member of the “Men in Dialogue” association, a researcher in the area of masculinities, and an editor of “Masculinities and Social Change,” a journal indexed in WoS and Scopus; and Dr. Esther Oliver, who has expertise in the evaluation of projects from the European Framework Program of Research and is a researcher in the area of gender violence.

Students’ attitudes towards learning

The data collected suggest that the students who participate in the IGs and the DLGs have positive attitudes towards learning in general terms. Table 3 indicates that the answers for almost all the items are clearly positive; this is true for between 75% and 80% of the responses, except for three items (#3, #4, and #6). This outcome is understandable since these three items were worded negatively, i.e., instead of being phrased positively, as with the rest of the items, they were phrased negatively, so the expected outcome was that the positive trend in the answers would be reversed, which is what occurred. Surprisingly, in the case of item #1, which we expected to function similarly to items #3, #4 and #6, the responses are aligned with those of the rest of the items.

Table 3. https://doi.org/10.1371/journal.pone.0240292.t003

The answers to item 1 (“We learn best when the teacher tells us what to do”) may indicate a preference for an active role of the teacher, which a priori would not be the expected answer in the context of using IGs and DLGs. In contrast, what we would expect in that context is for students to show a preference for answers related to an active role of the student, which is the case for the rest of the items analyzed. However, the fact that the respondents also claim that they learn better when the teacher tells them what to do suggests either that there is a bias in the student responses due to what Yackel and Cobb [ 26 ] call the “norms”, also theorized as the “didactic contract” in Brousseau’s terms [ 29 ], which regulate the social interactions within the classroom, or that the role of the teacher as a leader is recognized by these students.

Table 4 summarizes the previous results in two categories (agree and disagree). The trend noted above can now be clearly seen.

Table 4. https://doi.org/10.1371/journal.pone.0240292.t004

Items #4 and #10 are crucial to understanding the attitudes that students have towards learning. In the first case, roughly half of the respondents reject the claim that learning is a boring activity. If we assume that, for a boy or a girl between 7 and 11 years old, defining an activity as fun or boring is a clear way of indicating their attitude towards that activity, the fact that half of the students participating in the survey declare that learning is not boring suggests that their participation in the IGs and DLGs makes these two activities in some way attractive to them.

On the other hand, another relevant aspect regarding the students’ attitudes is the feeling of self-confidence. Previous research has provided much evidence that suggests the importance of this aspect in the attitudes that children can have towards learning [ 5 ]. Boys and girls who have confidence in themselves tend to show a clearly positive attitude towards learning. The data suggest that this is what happens when students participate within IGs or DLGs, i.e., three out of four children affirm that they feel more self-confident with regard to learning in school than they normally do. This result is relevant because it suggests that both IGs and DLGs have a clear impact on the positive transformation of attitudes towards learning. The data show that this is true for children in the three schools that participated in the survey, regardless of the country or the context in which they are located.

Principal components analysis

The KMO test indicates whether the partial correlations between the variables are small enough to perform a factor analysis. Table 2 shows that in this case, the KMO test has a value of 0.517 for the students participating in IGs and 0.610 for the students participating in the DLGs, which allows us to assume (although with reservations) that we can perform a factor analysis to find the principal components explaining the variance. Bartlett’s sphericity test (which tests the null hypothesis that the correlation matrix is an identity matrix, in which case we could not assume that there are significant correlations between the variables) yields a significance value of 0.000 in both cases, which suggests that we can reject the null hypothesis of sphericity and, consequently, that the factorial model is adequate to explain the data.
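
The KMO statistic and Bartlett’s sphericity test reported above were obtained with SPSS. As an illustration only, both can be computed from the item correlation matrix as in the following NumPy/SciPy sketch (file and column names are hypothetical):

    import numpy as np
    import pandas as pd
    from scipy.stats import chi2

    df = pd.read_csv("sam_responses.csv")                      # hypothetical coded dataset
    X = df[[c for c in df.columns if c.startswith("item_")]].dropna().to_numpy()
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)                           # item correlation matrix

    # Bartlett's test of sphericity: H0 says R is an identity matrix.
    chi_sq = -((n - 1) - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    dof = p * (p - 1) / 2
    p_value = chi2.sf(chi_sq, dof)                             # p < 0.05 -> reject H0

    # KMO measure of sampling adequacy.
    inv_R = np.linalg.inv(R)
    d = np.sqrt(np.diag(inv_R))
    partial = -inv_R / np.outer(d, d)                          # anti-image (partial) correlations
    np.fill_diagonal(partial, 0.0)
    r_off = R.copy()
    np.fill_diagonal(r_off, 0.0)
    kmo = (r_off ** 2).sum() / ((r_off ** 2).sum() + (partial ** 2).sum())
    print(f"Bartlett chi2={chi_sq:.1f}, p={p_value:.4f}; KMO={kmo:.3f}")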

After performing ANOVA several times, considering different combinations of items in the tested models, we found two models (one for the students who had participated in the IGs and another for those engaged in the DLGs) that explained more than half of the variance. Tables 5 and 6 present the results obtained.

Table 5. https://doi.org/10.1371/journal.pone.0240292.t005

Table 6. https://doi.org/10.1371/journal.pone.0240292.t006

As Tables 5 and 6 indicate, the items included in the SAM test better explain the attitudes of the students participating in the study regarding the DLGs than regarding the IGs. For the DLGs, there are four components with eigenvalues above 1, explaining 74.227% of the variance. In contrast, in the case of the IGs, we find only two components with eigenvalues above 1, which together explain only 63.201% of the variance. This suggests that the SAM test probably works better as an instrument to measure attitudes in the case of the DLGs.
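
As a sketch of the extraction step (the published figures come from SPSS), the eigenvalues and the cumulative percentage of variance can be reproduced with scikit-learn on the standardized items; column names are hypothetical:

    import numpy as np
    import pandas as pd
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    df = pd.read_csv("sam_responses.csv")                      # hypothetical coded dataset
    items = df[[c for c in df.columns if c.startswith("item_")]].dropna()

    X = StandardScaler().fit_transform(items)                  # work on the correlation structure
    pca = PCA().fit(X)

    eigenvalues = pca.explained_variance_
    cum_pct = np.cumsum(pca.explained_variance_ratio_) * 100
    n_keep = int((eigenvalues > 1).sum())                      # Kaiser criterion: eigenvalue > 1
    print("eigenvalues:", np.round(eigenvalues, 3))
    print(f"components retained: {n_keep}, cumulative variance: {cum_pct[n_keep - 1]:.1f}%")

    # Unrotated loadings (the component matrix): item-component correlations.
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    print(np.round(loadings[:, :2], 2))                        # loadings on the first two components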

The scree plots make it easier to visualize this result. In the left panel of Fig 1 (the scree plot obtained for the IGs), a clear inflection is observed from component 2. In contrast, in the case of the DLGs, the inflection occurs from the fourth component onward.

Fig 1. https://doi.org/10.1371/journal.pone.0240292.g001

The matrix of components suggests that, in the case of the IGs, factor 1 is formed by the components that we can label as “active peer-support” (#9 “Helping my friends has helped me to understand things better”) and “active listening” (#8 “It is good to hear other people’s ideas”). Factor 2, on the other hand, is formed by the component of “participation” (#2 “We can learn more when we can express our own ideas”). For the IGs, the component “individualism” (#6 “I would rather think for myself than hear other people’s ideas”) is clearly the least explanatory (-0.616), which seems to suggest that collaboration within the groups is a fundamental aspect of the learning dynamic occurring within them. Table 7 shows that the first factor is the one that explains most of the variance, while factors 2, 3 and 4 are less important since their weights are almost irrelevant.

Table 7. https://doi.org/10.1371/journal.pone.0240292.t007

When we observe the results for the case of the DLGs, the component matrix ( Table 8 ) indicates that factor 1 is formed mainly by the components of “positive discussion” (#11 “I like discussing the books we read with the class”), “self-confidence in school” (#10 “I am more confident about learning in school than I used to be”), and “participation” (#2 “We can learn more when we can express our own ideas”). In contrast, factor 2 contains a single component (#1 “We learn best when the teacher tells us what to do”). In the case of factor 3, the most explanatory component is the sixth (#6 “I would rather think for myself than hear other people’s ideas”). Finally, factor 4 includes the third component of the SAM test (#3 “Learning through discussing in class is confusing”).

Table 8. https://doi.org/10.1371/journal.pone.0240292.t008

Fig 2 shows the loading scores for each component in the rotated space, both for the IGs and the DLGs. The data confirm the interpretation of the previous tables. The graphs show that for the IGs (the left side of Fig 2 ), variables #8 and #9 tend to explain the maximum variance of factor 1, while in the case of the DLGs, the three-dimensional component chart shows two slightly differentiated groups of variables.

Fig 2. https://doi.org/10.1371/journal.pone.0240292.g002

Construction of subscales of attitudes towards learning in IGs and DLGs

The collected data allow us to think that the variables obtained with the SAM test may explain the attitudes that students have towards learning in the context of IGs and DLGs. However, according to previous theoretical models, it seems plausible to assume that not all variables are equally precise in explaining the attitudes towards learning shown by the students interviewed in both contexts. For this reason, in this section, we compare two possible scales for each context (IGs and DLGs) to identify which variables are more reliable in explaining those attitudes.

In the case of the IGs, we created two subscales. The first subscale (Tables 9 – 11 ) includes items #1 (“We learn best when the teacher tells us what to do”), #7 (“I enjoy learning when my friends help me”) and #10 (“I am more confident about learning in school than I used to be”). In contrast, subscale 2 (Tables 12 – 14 ) incorporates items #2 (“We learn more when we can express our own ideas”), #5 (“Learning in school is better when we have other adults to work with us”) and #11 (“I like discussing the books we read with the class”). Table 9 shows the Cronbach’s alpha value for subscale 1, which is rather mediocre (0.412), while Table 12 indicates that subscale 2 is much more reliable (Cronbach’s alpha of 0.794), suggesting that subscale 2 works better than the first one to characterize the components explaining the results obtained within the IGs. The difference between the two subscales is that the first one does not include the role of other adults working with the students, while in subscale 2, item #5 (“Learning in school is better when we have other adults to work with us”) is the item that presents the highest correlation (0.613), as seen in Table 13 , which shows the interitem correlation matrix for the variables of subscale 2.
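
Cronbach’s alpha for a subscale can be sketched directly from its formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score); the column names below, standing in for items #2, #5 and #11 of subscale 2, are hypothetical:

    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha from a respondents x items DataFrame."""
        items = items.dropna()
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    df = pd.read_csv("sam_responses.csv")          # hypothetical coded dataset
    subscale_2 = df[["item_02", "item_05", "item_11"]]
    print(round(cronbach_alpha(subscale_2), 3))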

Table 9. https://doi.org/10.1371/journal.pone.0240292.t009

Table 10. https://doi.org/10.1371/journal.pone.0240292.t010

Table 11. https://doi.org/10.1371/journal.pone.0240292.t011

Table 12. https://doi.org/10.1371/journal.pone.0240292.t012

Table 13. https://doi.org/10.1371/journal.pone.0240292.t013

Table 14. https://doi.org/10.1371/journal.pone.0240292.t014

Regarding DLGs, we also created two subscales, i.e., subscales 3 (Tables 15 – 17 ) and 4 (Tables 18 – 20 ). Subscale 3 is formed by variables #1 (“We learn best when the teacher tells us what to do”), #5 (“Learning in school is better when we have other adults to work with us”), #10 (“I am more confident about learning in school than I used to be”) and #11 (“I like discussing the books we read with the class”). In contrast, subscale 4 includes variables #2 (“We learn more when we can express our own ideas”), #7 (“I enjoy learning when my friends help me”), #8 (“It is good to hear other people’s ideas”) and #9 (“Helping my friends has helped me to understand things better”).

Table 15. https://doi.org/10.1371/journal.pone.0240292.t015

Table 16. https://doi.org/10.1371/journal.pone.0240292.t016

Table 17. https://doi.org/10.1371/journal.pone.0240292.t017

Table 18. https://doi.org/10.1371/journal.pone.0240292.t018

Table 19. https://doi.org/10.1371/journal.pone.0240292.t019

Table 20. https://doi.org/10.1371/journal.pone.0240292.t020

Table 15 shows the results for subscale 3 (DLGs), including a high Cronbach’s alpha value (0.820), indicating that this subscale is a good proposal. According to the data shown in Tables 16 and 17 , subscale 3 works better when component #10 is removed from the model (increasing Cronbach’s alpha from 0.820 to 0.829). This result suggests that self-confidence is not a relevant component of attitudes towards participating in DLGs. Subscale 3 is also better than subscale 4, whose Cronbach’s alpha value of 0.767, although high, is lower than that found for subscale 3.
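
The “alpha if item deleted” check mentioned above (dropping item #10 raises alpha from 0.820 to 0.829) can be sketched by recomputing alpha after removing each item in turn, reusing the cronbach_alpha helper and the hypothetical column names from the earlier sketch:

    subscale_3 = df[["item_01", "item_05", "item_10", "item_11"]].dropna()
    print("full subscale:", round(cronbach_alpha(subscale_3), 3))
    for col in subscale_3.columns:
        without = cronbach_alpha(subscale_3.drop(columns=col))
        print(f"alpha without {col}: {without:.3f}")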

The clearest difference between the two subscales (3 and 4) is that the first one (subscale 3) includes item #11 (“I like discussing the books we read with the class”), which is the item most focused on the context of the DLGs. The other items are more related to the interaction among the students (Tables 18 – 20 ). However, the amount of correlation explained by subscale 4 is lower than that explained by subscale 3. This may explain why subscale 3 is more reliable in measuring students’ attitudes towards learning in social/interactional contexts, such as DLGs.

Discussion and conclusions

Previous research in education has provided enough evidence to claim that attitudes have a relevant impact on learning [ 42 – 46 ]. Studies such as that of Fennema and Sherman [ 3 ] confirmed almost half a century ago that “affective factors (…) partially explain individual differences in the learning of mathematics” [ 3 ]. Currently, we know that learning results depend, to a certain extent, on the attitudes that students have towards learning. When there is a clear resistance to school and school practices, it is more difficult for students to achieve good results. Aspects such as motivation and self-concept, for instance, are relevant to explaining a positive attitude towards learning. These aspects often appear to be correlated [ 19 ]. When a child has a poor self-concept as a student, s/he often feels very unmotivated to learn in school. The literature reports numerous cases of students who actively or passively resist or even refuse to make an effort to learn their lessons because they feel that they cannot learn anything. In contrast, when students’ self-image is positive, it is easier for them to learn. In those cases, the data provide evidence of positive correlations between learning achievements and attitudes. Rosenthal and Jacobson [ 5 ] called this type of behavior the Pygmalion effect.

SEAs [ 7 ] such as IGs and DLGs are framed within dialogic learning theory, one of whose main principles is that of transformation. As Freire [ 47 ] claimed, people “are beings of transformation, and not of adaptation.” Education has the capacity to create opportunities for people to transform themselves. Drawing on the assumption that “education needs both technical, scientific and professional training, as well as dreams and utopia” [ 47 ], SEAs integrate practices endorsed by the international scientific community to create real opportunities for learning for children. The data presented in the previous section suggest that children participating either in IGs or in DLGs have a clearly positive attitude towards learning. Table 4 suggests that these children truly enjoy learning. Less than half of the participants say that learning at school is “boring” (41.8%). In contrast, almost eight out of ten children interviewed said they love learning (78.6%). The SAM test items, validated in previous studies [ 10 , 48 , 49 ], have been confirmed as components with which to measure children’s attitudes towards learning. For the first time in the context of studying the impact of actions included within the SEAs [ 7 ], we have been able to identify (and measure) a positive relationship between implementing SEAs and students’ attitudes, i.e., boys and girls engaged in the IGs and/or the DLGs showed a clearly positive attitude towards learning. We can therefore claim that in the context of SEAs, students show positive attitudes towards learning. The data analyzed suggest that participating in IGs and DLGs empowers students to transform their own attitudes towards learning.

On the other hand, we know that social contexts have a powerful influence on people’s attitudes. A context of positive empowerment, based on the idea of “maximum expectations” [ 50 , 51 ], is able to transform the attitudes that students have towards learning (especially those who are more resistant to learning and school). In contexts where school and school practices are not valued, children have to overcome the social tendency to openly show resistance against school (and everything that represents the school, such as teachers, attitudes of compliance with school activities, norms, etc.) and embrace a new tendency of valuing all these aspects. However, as previous studies framed within the symbolic interactionism approach have largely demonstrated [ 6 , 8 ], it is hard to go against the social pressure of the group. We define our identity as a result of our interactions with others. If the group finds it attractive to resist schooling, school norms and practices, then it is going to be difficult for individual students to achieve good academic results (unless they encounter a different context elsewhere) because they have to fight against the social pressure of not valuing school, in addition to the inherent difficulties of learning itself (in cognitive and curricular terms). In contrast, when the context is transformed (to adopt the terms of Freire and Flecha) and learning becomes a valued practice, children usually transform their attitudes, which previous research has correlated with successful learning achievement [ 14 , 15 , 33 ]. The data collected and discussed herein provide evidence of how changing the context (drawing on the two SEAs of IGs and DLGs) can transform students’ attitudes towards learning. As we stated in the previous section, 78.6% of the students participating in the survey claimed that they like to learn after participating in either IGs or DLGs. They claim that they like it “when my friends help me.” Along the same lines, 78.3% of the respondents said that “it is good to hear other people’s ideas,” while 76.7% claimed that “helping my friends has helped me to understand things better.” This type of answer clearly indicates that IGs and DLGs create a context in which learning is valued positively. Attitudes such as solidarity, willingness to help others, and friendship seem to indicate a preference for a context oriented towards learning rather than resisting it. Hence, transforming the context also changes how individuals recreate their own identities using different values as referents, which, drawing on Mead [ 6 ], is how identity creation works. The evidence collected herein suggests that IGs and DLGs work to increase students’ academic performance because they transform the students’ context; hence, students transform their own attitudes (as expected according to the theory of symbolic interactionism).

By analyzing in more detail what happens in both the IGs and the DLGs, we have been able to verify that the attitudes that emerge among the students participating in the IGs and those participating in the DLGs are slightly different. In the case of the IGs, the data collected reveal that children place much more value on collaborative work with the rest of their classmates, as seen in the answers to items #8 (“It is good to hear other people’s ideas”) and #9 (“Helping my friends has helped me to understand things better”). These two items are the main components explaining the variance detected. On the other hand, in the case of the DLGs, the ability to express one’s ideas is especially valued. In this case, the variance is explained above all by items #2 (“We learn more when we can express our own ideas”), #10 (“I am more confident about learning in school than I used to be”) and #11 (“I like discussing the books we read with the class”). The last component (#11) clearly belongs to a context similar to that of the DLGs. However, the two other components (#2 and #10) suggest that participation in DLGs is related to the development of a positive self-image as a learner. The chi-square test indicates that the correlations are significant in both cases. Therefore, the data obtained suggest that participating in IGs and/or DLGs is related to showing positive attitudes towards learning (both as an individual and as a member of the group, i.e., in a social sense).

On the other hand, when analyzing the reliability of the results, it can be verified that in the case of the IGs, the most important correlation appears for item #5 (“Learning in school is better when we have other adults to work with us”). This finding is very relevant since it constitutes empirical evidence for something that Vygotsky already suggested when he proposed his concept of the zone of proximal development (ZPD): for the process to work, there must be an adult or a more capable peer who helps learners achieve what they can only achieve with that support. The difference between IGs and other collaborative learning groups is exactly that, i.e., in the IGs, there is always an adult who facilitates the activity (who does not provide the answers but encourages the children to engage in a dialogic interaction [ 15 ]).

Regarding the DLGs, the most important correlation appears for item #11 (“I like discussing the books we read with the class”), which is an aspect that makes sense in the context of the gatherings. Children affirm that they like to read books together with their classmates. As we know, this activity has clear advantages from the point of view of the development of reading comprehension [ 41 , 52 ].

A surprising finding is the high response rate for item #1 (“We learn best when the teacher tells us what to do”). This would seem to be inconsistent with using IGs or DLGs, where the role of the teacher is rather marginal or passive (the teacher organizes the activity but neither gives answers nor explains the academic content as in a traditional lecture). A possible reason for this result is that the school, as an institution, is characterized by a series of social norms [ 26 ]. Waiting for teachers’ directions is part of those norms. It is assumed that when attending school, we must pay attention to what the teacher says. This idea corresponds to the social image of the teacher as a transmitter of knowledge, which is part of the social norms characterizing the school as an institution. It is possible that even though the children participating in this study have engaged in IGs and DLGs, they are not excluded from the norms of the social context, so their attitudes are tinged by them.

We can therefore conclude that the SAM test demonstrates that children who participate in IGs and/or DLGs clearly show positive attitudes towards learning after participating in these two SEAs. Perhaps this is one of the fundamental variables explaining the successful learning results that other studies have found among children using SEAs [ 11 – 15 ].

Future implications

This research confirms some aspects of learning, while it leaves others open for further study. We have observed that children who participate in SEAs show positive attitudes towards learning. However, what we do not know (yet) is whether it is the use of these SEAs that explains why these children show these attitudes or whether the transformation is due to other reasons. To clarify this, further experimental research is needed comparing groups of students using SEAs with groups of students using other types of educational actions.

On the other hand, the data that we have discussed herein suggest that there is a social component with a critical influence on the type of attitudes that students report in the survey. According to the way the IGs and DLGs work, solidarity, interaction, and sharing seem to explain why these children develop positive learning attitudes. However, it would be interesting to continue this line of research to see whether this outcome also appears when other educational actions are used whose principles of action are different (when they are centered on the individual, for example).

Finally, the evidence seems to support the statement that the successful academic performance of children who participate in IGs and DLGs is explained by the fact that participating in these two types of SEAs transforms the children’s context into one positively oriented towards learning. Indeed, the results are hopeful. However, we need to replicate this study further to confirm (or refute) that statement. In any case, confirming that statement and addressing the preceding research questions has a clear implication: teachers have to put effort into designing their lessons, since how they organize their classes truly shapes students’ learning.

Supporting information

S1 File. SAM questionnaire: What I think about learning in school.

https://doi.org/10.1371/journal.pone.0240292.s001

S2 File. Dataset.

https://doi.org/10.1371/journal.pone.0240292.s002

References

  • 5. Rosenthal R, Jacobson L. Pygmalion in the classroom: Teacher expectation and pupils’ intellectual development. New York: Holt, Rinehart & Winston; 1968.
  • 6. Mead GH. Mind, self and society. Chicago: University of Chicago Press; 1934.
  • 7. Flecha R. Successful educational actions for inclusion and social cohesion in Europe. Cham: Springer; 2014.
  • 8. Willis PE. Learning to labor: How working class kids get working class jobs. Columbia University Press; 1981.
  • 9. Bruner J. The culture of education. Harvard University Press; 1996.
  • 18. Renaud RD. Attitudes and dispositions. In: Hattie J, Anderman EM, editors. International guide to student achievement. New York, London: Routledge; 2013. p. 57–58.
  • 25. Vygotsky LS. Mind in society: The development of higher psychological processes. Harvard University Press; 1978.
  • 27. Lave J, Wenger E. Situated learning: Legitimate peripheral participation. Cambridge University Press; 1991.
  • 28. Wenger E. Communities of practice: Learning, meaning, and identity. Cambridge University Press; 1999.
  • 37. Habermas J. The theory of communicative action. Boston: Beacon; 1984.
  • 39. Soler-Gallart M. Learning through dialogue: Toward an interdisciplinary approach to dialogic learning in adult education [doctoral dissertation]. Harvard Graduate School of Education.
  • 41. Bakhtin MM. The dialogic imagination: Four essays. Austin: University of Texas Press; 1981.
  • 47. Freire P. A la sombra de este árbol. Barcelona: El Roure; 1997.
  • 49. Galton M, Comber C, Pell T. The consequences of transfer for pupils: Attitudes and attainment. In: Hargreaves L, Galton M, editors. Transfer from the primary classroom 20 years on. London: Routledge Falmer; 2002. p. 131–158.

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

  • Advanced Search
  • Journal List

Logo of plosone

Transforming students’ attitudes towards learning through the use of successful educational actions

Javier díez-palomar.

1 Department of Linguistic and Literary Education, and Teaching and Learning of Experimental Sciences and Mathematics, University of Barcelona, Barcelona, Spain

Rocío García-Carrión

2 Faculty of Psychology and Education, University of Deusto, Ikerbasque, Basque Foundation for Science, Bilbao, Spain

Linda Hargreaves

3 Faculty of Education, University of Cambridge, Cambridge, United Kingdom

María Vieites

4 CREA–Community of Research of Excellence for All, University of Barcelona, Barcelona, Spain

Associated Data

All relevant data are within the manuscript and its Supporting Information files.

Previous research shows that there is a correlation between attitudes and academic achievement. In this article, we analyze for the first time the impact of interactive groups (IG) and dialogic literary gatherings (DLG) on the attitudes that students show towards learning. A quantitative approach has been performed using attitude tests validated by previous research. The data suggest that in both cases, the participants show positive attitudes. The social context has an important influence on students’ attitudes. The items with higher correlations include group work, mutual support, and distributed cognition. In the case of IGs, group work is much more appreciated, while in the case of DLGs, self-image and self-confidence are the two most clearly valued attitudes. The positive impact of IGs and DLGs on students’ attitudes may have potential for teachers in transforming their practices and decision-making within the classroom.

Introduction

In this article, we address the following research question: What impact does participation in interactive groups (IGs) and dialogic literary gatherings (DLGs) have on the attitudes that students show towards learning? To define “attitudes”, we draw on the definition of Harmon-Jones, Harmon-Jones, Amodio and Gable [ 1 ], who characterize attitudes as “subjective evaluations that range from good to bad that are represented in memory” [ 1 ]. This definition is also consistent with the classic definitions about “attitudes” used in social psychology studies [ 2 ].

Previous research suggests that there is a clear relationship between students’ attitudes and their academic achievement [ 3 , 4 ]. Decades ago, classic studies in the field of educational research [ 5 ] concluded that teachers’ expectations about students’ attitudes and behaviors may explain students’ effective academic achievement. According to [ 4 ], the process of the “social construction of identity” explains why there are students who seem destined to obtain poor academic results. Drawing on the theoretical approach developed by Mead [ 6 ], Molina and her colleagues [ 4 ] argue that the way in which a student defines his/her own identity determines his/her own learning expectations and, as a consequence, his/her own academic career. In this sense, the process of constructing identity is social in essence.

According to Mead [ 6 ], the self emerges as a result of social interaction with other people who project their expectations and attitudes on the individual. The identity of a person is formed by two components. The first component is the me , which is of social origin and incorporates the attitudes of others about the individual; the second component is the I , which is the conscious reaction of each individual to those attitudes. A process is thus created in which identity is the result of the dialogue between the individual and others. This somehow explains why some students end up developing an identity as bad students, while others develop an identity as good students. This process has been called the “Pygmalion effect” by educational researchers [ 5 ]. As Flecha [ 7 ] suggests, drawing on successful educational actions (SEAs), teachers get their students to achieve better results, and that, in turn, explains why these students improve their self-concept as students (i.e., their me and I , in Mead’s terms). However, does that mean that they also change their attitudes towards learning in school?

Classic studies such as Learning to Work [ 8 ] suggest that students with low academic performance tend to be children who reject school. These students tend to manifest that feeling of rejection through wayward attitudes. They do not see school as a desirable or attractive place; instead, they show an attitude of rejection and resistance towards schooling, which is accompanied by low academic performance. Later researchers, such as Bruner, have suggested that this attitude results from the failure of schools to respond to the expectations of these children (and their families) [ 9 ]. These students’ identities are defined in other spaces and with other references, which may have a negative impact on teachers’ ability to teach. Various studies have suggested that as some of these children grow, their interest in school diminishes. This happens especially in the transition between primary and secondary school, when some students lose interest in science, mathematics and other subjects. Given this situation, researchers have found that learning initiatives located outside the school, such as visits to museums, laboratories, or research centers, can change these attitudes towards learning [ 10 ].

This article discusses the impact of participating in two educational activities previously defined as successful educational actions (SEAs) [ 7 ] on attitudes towards learning shown by students who have participated in these actions. Thus far, we have clear evidence of the positive impact that SEAs have on learning outcomes [ 11 – 15 ], and there are studies suggesting that there is also a positive effect on the coexistence and cohesion of the group-class [ 16 , 17 ]. However, previous studies have not explored the impact that such SEAs may have on students’ own attitudes. Therefore, this article discusses this dimension of learning, which, as the studies mentioned above claim, is a relevant aspect to understanding how learning works.

Theoretical framework

Attitudes and learning.

There is an assumed belief in education that there is a direct relationship between student attitudes and academic achievement. Renaud [ 18 ] distinguishes between dispositions and attitudes by stating that the former are more “resistant” to change than the latter. Dispositions are defined as “more general and enduring characteristics”, while attitudes are tendencies or internal states of the person towards anything that a person can evaluate, such as “learning math, extracurricular activities, or the general notion of going to school” [ 18 ]. Previous research on attitudes and learning has found a clear relationship between the two. Renaud [ 18 ] cites literature reviews indicating that there is a correlation between attitude and achievement in mathematics [ 19 – 21 ] and in science [ 22 ]. According to previous research, the relationship becomes stronger at higher educational levels [ 22 – 24 ].

Ma and Kishor [ 19 ] analyze three attitude-related indicators to evaluate their impact on academic achievement: self-concept in mathematics, family support, and gender roles in mathematics. According to their data, the strongest correlation corresponds to self-concept (p. 24). Masgoret and Gardner [ 23 ] found that motivation is more closely related to academic achievement than attitude; attitude does appear to be related to achievement, but only indirectly, through motivation. Motivation and self-concept are clearly correlated, but the research is not conclusive about the direction of the relationship, i.e., we do not know (yet) whether motivation gives rise to a positive self-concept or whether a positive self-concept translates into greater motivation. In any case, both variables seem to correlate directly with academic achievement; higher motivation and self-concept are associated with better academic results (in general terms).

The “symbolic interactionism” approach

The theoretical approach that has devoted the most effort to analyzing the relationship between attitudes, motivation, self-concept and academic achievement is that of symbolic interactionism. George H. Mead [ 6 ] is one of the best-known representatives of this theory. According to his findings, self-concept is of social origin. “The self is something which has a development; it is not initially there at birth” [ 6 ]. To explain this process, Mead proposes the concept of the generalized other. According to him, the generalized other is “the organized community or social group which gives to the individual his or her unity of self” [ 6 ]. Mead illustrates how this concept works to create the self-concept by drawing on the game metaphor. He uses the example of baseball. The baseball team is what Mead calls the generalized other. Each player has a specific role within the team, and he or she acts in accordance with what is expected of him or her in that position. The rest of the team does the same, so that individual actions are defined and carried out within the more global unit that is the team (i.e., the generalized other). Using this example and others, Mead [ 6 ] was able to show that the self is the result of a social process. Similarly, Vygotsky [ 25 ] claimed that higher psychological functions emerge through interpersonal connections and actions with the social environment until they are internalized by the individual.

Mead states as follows:

It is in the form of the generalized other that the social process influences the behavior of the individuals involved in it and carrying it on, i.e., that the community exercises control over the conduct of its individual members; for it is in this form that the social process or community enters as a determining factor into the individual’s thinking. [ 6 ]

This social process involves interaction with other individuals in the group through shared activities. The classroom is the perfect example of a group. The teacher and the students are part of a social group with defined norms [ 26 ] as well as an institutional objective (teaching and learning), where each “player” performs a specific role according to those (declared or implicit) norms.

In some investigations, this social unit has been defined as a “community of practice” [ 27 , 28 ]. Brousseau [ 29 ] uses the concept of the “contrat didactique” (didactic contract) to characterize this social unit and analyze its functioning in the mathematics classroom. According to Brousseau, there is a relationship between the different actors (individuals) participating in the mathematics classroom, in which each plays a specific role and has specific responsibilities. The teacher has the obligation to create sufficient conditions for the appropriation of knowledge by the students and must be able to recognize when this happens. Similarly, the responsibility of the students is to satisfy the conditions created by the teacher. Brousseau studies how the teacher performs what he calls the didactic transposition of scientific knowledge to be taught in school. S/he has to identify the epistemological and cognitive obstacles that make it difficult for students to learn in the classroom.

However, other studies have suggested that there are factors of another nature (neither cognitive nor epistemological) that also influence the academic achievements reached by students. This is the case with interactions [ 16 , 30 , 31 ]. As Mead [ 6 ] suggested, the self-concept created by an individual is the result of the internalization of the expectations that each individual has of himself or herself by the role s/he plays in the group to which s/he belongs. For example, the student who always tries hard and answers the teacher’s questions is fulfilling his or her role within the good student group. The group expects him or her to play that role. It is part of his/her identity. In addition, s/he acts accordingly. The effect of the positive or negative projection of expectations on students has been widely studied in education [ 32 , 33 ]. What we know is that teachers have to be cautious and try not to project negative expectations on students because that has a clear effect on their academic achievements, giving rise to well-studied interactions such as the Pygmalion effect, or the self-fulfilling prophecy [ 5 ].

However, the impact of successful educational actions [ 7 ] on the attitudes that students have towards learning at school in the context of interactive groups and dialogic literary gatherings has not been studied so far. This impact is what is discussed in this article.

The successful educational actions of interactive groups and dialogic literary gatherings

The research question discussed in this article is framed in the context of the implementation of two successful educational actions identified by the European Commission in the research project titled INCLUD-ED : Strategies for inclusion and social cohesion from education in Europe [ 34 ]. A successful educational action is defined as an action carried out in the school, the result of which significantly improves students’ learning [ 7 ]. The two successful educational actions that are discussed herein are interactive groups and dialogic literary gatherings.

Interactive groups

Interactive groups (IGs) are a group-based teaching practice in which students are put together in small groups of approximately six or seven, with an adult facilitating the task. IGs must be heterogeneous in composition, including children of different ability levels, genders, socioeconomic backgrounds, etc. The adult (the facilitator) is a volunteer who encourages dialogic interaction among the group members while they perform the task designed by the teacher. The teacher splits the students among four or five IGs (depending on the number of children in the classroom and the time available for the lesson). Each group of students has an assigned task, which the teachers have previously designed. There is a total of four or five tasks (the same number as the number of IGs). The tasks address the subject being covered in the lesson plan (i.e., mathematics, language, science, history, etc.). To perform the assigned task, groups have fifteen or twenty minutes (depending on the total time allocated for that activity in the school day). After this interval, the teacher asks the students to move to the next IG, where they will find another task. When the class is over, each of the children must have gone through all of the tasks. All of the children thus perform four or five different tasks designed by the teacher to cover the curriculum requirements. In some schools, the children move from one IG to the next. In other cases, the teachers prefer to ask the facilitators to move between the groups to avoid the noise and disorder created by the children getting up and moving to the next table (task).

The facilitators never provide solutions to the tasks executed by the students. Instead, they encourage students to share, justify, and explain their work to their group mates. Their responsibility is to encourage students to use dialogic talk [ 15 ], which is based on the principles of dialogic learning [ 35 ]. Research evidence on IGs suggests that using dialogic talk increases participants’ chances of improving their academic achievements [ 15 , 30 , 36 ]. When students are asked by the facilitator to justify their answers to a task, they need to defend their claims conceptually; this implies that they must be able not only to understand the concept or concepts embedded in the assigned tasks but also to explain them to their group mates. The type of talk (speech) that appears when children engage in this type of interaction has been defined as dialogic talk [ 30 ] because it is oriented towards validity claims [ 37 ], not towards the power position that children occupy within the group.

Dialogic literary gatherings

Dialogic literary gatherings (DLGs) are spaces in which students sit in a circle and share the reading of a classic literary book. The gathering is facilitated by the teacher, whose role is not to intervene or give his/her opinion but to organize the students’ participation by assigning them turns. Every child who wants to share his/her reading raises his/her hand and waits until the teacher gives him/her a turn. Readings come from classic literature, such as works by Shakespeare, Cervantes, Kafka, Tagore, etc. [ 38 , 39 ]. Students read the assigned number of pages at home (either a whole chapter or a certain number of pages, according to the teacher’s criteria). While reading the assignment, the student highlights a paragraph and writes down the reason for his/her choice. Then, during the DLG session in school, the children bring the paragraph or paragraphs they want to share with the rest of their classmates. At the beginning of the session, the teacher asks who wants to share his/her paragraph and writes down the names of the students offering to share on a list. Then, the teacher starts with the first name on the list, and that student reads his/her paragraph aloud, indicates the page on which it appears so that the rest of the participants in the DLG can follow the reading, and explains the reason for his/her choice. After the reading, the teacher opens the floor for questions, always prioritizing the children who participate less often. When the teacher considers that the topic has been sufficiently discussed, either because the idea that prompted the intervention has been fully commented on or because the children's questions have drifted to irrelevant topics, s/he moves to the next name on the list. The process is repeated until the session ends.

When talking about their paragraphs, children become involved in a process called “dialogical reading” [ 40 ], which is based on the application of Bakhtin’s concept of “dialogism” [ 41 ]. Bakhtin explains this concept using the idea of “polyphony” to refer to the use of multiple voices in a narrative work, as in the case of Dostoevsky’s poetics [ 41 ]. According to Bakhtin, no voice is the result of a single speech; rather, it integrates different voices. This concept has been reused and reinterpreted in educational research. Drawing on these authors, knowledge is the result of internalizing the voices of the multiple people (teachers, family members, friends, classmates, and others) that we have encountered throughout our lives. DLGs recreate that multiplicity of voices through dialogues that generate a space in which all children contribute their opinions, ideas, and understandings of the paragraph being discussed. In this sense, reading comprehension develops in a much deeper way than if the child had to read the material individually, because s/he can incorporate the points of view of his/her peers into his/her own final comprehension.

Methodology

The data used to discuss the research question come from a research project titled SEAs4All–Schools as Learning Communities in Europe . The dataset has been submitted to this journal as supporting data for public use. Six schools from four European countries (Cyprus, the United Kingdom, Italy and Spain) participated in this project. Five of them were primary schools, and the sixth was a middle/high school. All of the schools were selected because they applied successful educational actions (SEAs) [ 7 ]. After implementing IGs and DLGs, a survey was conducted in three of the schools to evaluate the impact of using these two types of SEAs on students’ attitudes and perceptions towards learning. Children between 7 and 11 years old participated in the survey. Two of the surveyed schools are located in the United Kingdom, and the third one is located in Italy. The schools are located in different contexts: one of the schools in the United Kingdom is in an area where families have high economic status and high cultural capital (Cambridge), while the other English school is located in a neighborhood considered to be of medium-level SES (Norwich). The Italian school is located in a low-SES area of Naples. A total of 418 children participated in the survey (251 participating in DLGs and 167 engaging in IGs), as shown in Table 1 .

To collect the data, the SAM questionnaire, developed at the Universities of Leicester and Cambridge, UK, was used as a model for the evaluation of the impact of the implemented educational actions. The original SAM questionnaire consists of 17 items that are measured using a 5-point Likert scale, which ranges between “strongly agree” and “strongly disagree.” The questionnaire used in the current study was amended by drawing on previous results from a pilot test and was thus reduced to 12 items [ S1 File ].

The children answered a paper version of the questionnaire. The data were then coded and entered into an Excel matrix that was later used to analyze the data in SPSS (version 25.0). To debug the database and detect possible errors in the transcription, univariate descriptive analysis was conducted using the table of frequencies for each item to check that all codes and weights were aligned with the data collected through the paper questionnaires. Whenever an anomaly was detected, we proceeded to review the original questionnaire on paper to verify the information and data transcribed in the matrix.

To analyze the data, a descriptive report was first made by tabulating the data in frequency tables using the mean, median and mode, as well as the variance and standard deviation.
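As an illustration of these two steps (the univariate frequency check used to detect transcription errors and the descriptive summary), a minimal pandas sketch could look as follows; the authors worked in Excel and SPSS, so the file name sam_responses.csv and the item_01–item_12 column labels are hypothetical placeholders.

```python
# Minimal sketch of the data-debugging and descriptive steps (assumed file and column names).
import pandas as pd

df = pd.read_csv("sam_responses.csv")          # hypothetical export of the coded questionnaires
items = [c for c in df.columns if c.startswith("item_")]

# Frequency table per item: any code outside 1-5 flags a possible transcription error
for col in items:
    print(col)
    print(df[col].value_counts(dropna=False).sort_index())

# Central tendency and dispersion per item (mean, median, mode, variance, standard deviation)
summary = pd.DataFrame({
    "mean": df[items].mean(),
    "median": df[items].median(),
    "mode": df[items].mode().iloc[0],
    "variance": df[items].var(),
    "std": df[items].std(),
})
print(summary.round(2))
```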

To answer the research question, a principal components analysis was used because we do not know a priori whether there is any explanatory factor structure relating the IGs and DLGs to students’ attitudes. To test the validity of using such an analysis, Bartlett’s test (χ²) was used to check the homoscedasticity of the three schools participating in the survey. The test statistic takes the standard form

χ² = [ (N − k) ln S_p² − Σ_{i=1}^{k} (n_i − 1) ln S_i² ] / [ 1 + (1 / (3(k − 1))) ( Σ_{i=1}^{k} 1/(n_i − 1) − 1/(N − k) ) ],

where k = 3 represents each of the samples of the three schools whose students participated in the survey, n_i and S_i² are the size and variance of each sample, N is the total number of students, and S_p² is the pooled variance. Bartlett’s test was applied both to the subsample of students participating in the DLGs and to that of those participating in the IGs. The results of this test are shown in Table 2 .
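The same check can be sketched in Python with scipy, whose bartlett function implements the k-sample statistic above. This is only an illustrative equivalent of the reported procedure: the file name, the school column and the use of a summed attitude score as the compared variable are assumptions.

```python
# Hedged sketch of Bartlett's homogeneity-of-variance check across the k = 3 schools.
import pandas as pd
from scipy.stats import bartlett

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
items = [c for c in df.columns if c.startswith("item_")]
df["total"] = df[items].sum(axis=1)                        # summed attitude score per child (assumed)

# One sample of scores per school (k = 3); the "school" column name is an assumption
samples = [g["total"].to_numpy() for _, g in df.groupby("school")]
stat, p = bartlett(*samples)
print(f"Bartlett chi2 = {stat:.3f}, p = {p:.4f}")          # p >= .05: equal variances plausible
```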

Before performing Bartlett’s test, four of the items were recoded (#1, #3, #4, and #6), since the grading vector of the Likert scale used in these four items went in the opposite direction to that used for the rest of the items. These four items were worded in a negative tone (i.e., “we learn best when the teacher tells us what to do”, “learning through discussion in class is confusing”, “sometimes, learning in school is boring”, and “I would rather think for myself than hear other people’s ideas”), unlike the rest of the items, in which the tone of the statements was positive. Therefore, the labels “strongly agree”, “agree a little”, “not sure”, “disagree a little”, and “strongly disagree” for items #1, #3, #4, and #6 referred to a negatively oriented scale, whereas for the rest of the items, the same labels refer to a positive vector. For this reason, the responses to these four variables were recoded into four new variables that reversed the original direction of the response vector.
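A minimal sketch of this recoding step, under the assumption of a 1–5 numeric coding of the Likert labels and hypothetical item_XX column names:

```python
# Reverse coding of the negatively worded items (#1, #3, #4, #6).
import pandas as pd

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
reverse_items = ["item_01", "item_03", "item_04", "item_06"]

# On a 1-5 coding, 6 - x maps 1<->5 and 2<->4 while leaving 3 ("not sure") unchanged
for col in reverse_items:
    df[col + "_rec"] = 6 - df[col]
```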

In both cases (IGs and DLGs), Bartlett’s test is statistically significant, which supports the use of factor analysis to discriminate which principal components explain the greatest percentage of the variance. The results of this analysis are discussed below.

Ethics statement

The studies involving human participants were reviewed and approved by the Ethics Committee of CREA–Community of Research of Excellence for All, University of Barcelona. The schools collected the families’ informed consent approving the participation of their children in this study.

The Ethics Board was composed of: Dr. Marta Soler (president), who has expertise in the evaluation of projects from the European Framework Program of Research of the European Union and of European projects in the area of ethics; Dr. Teresa Sordé, who has expertise in the evaluation of projects from the European Framework Program of Research and is a researcher in Roma studies; Dr. Patricia Melgar, a founding member of the Catalan Platform against gender violence and a researcher in the area of gender and gender violence; Dr. Sandra Racionero, a former secretary and member of the Ethics Board at Loyola University Andalusia (2016–2018); Dr. Cristina Pulido, an expert in data protection policies and child protection in research and communication and a researcher in communication studies; Dr. Oriol Rios, a founding member of the “Men in Dialogue” association, a researcher in the area of masculinities, and an editor of “Masculinities and Social Change,” a journal indexed in WoS and Scopus; and Dr. Esther Oliver, who has expertise in the evaluation of projects from the European Framework Program of Research and is a researcher in the area of gender violence.

Students’ attitudes towards learning

The data collected suggest that the students who participate in the IGs and the DLGs have, in general terms, positive attitudes towards learning. Table 3 indicates that the answers for almost all the items are clearly positive; this is true for between 75% and 80% of the responses, except for three items (#3, #4, and #6). This outcome is understandable since these three items reverse the meaning of the question, i.e., instead of being phrased positively, as with the rest of the items, they were phrased negatively, so the expected outcome was that the positive trend in the answers would be reversed, which is what occurred. Surprisingly, in the case of item #1, which we expected to function similarly to items #3, #4 and #6, the responses are aligned with the rest of the items.

The answers to item 1 (“We learn best when the teacher tells us what to do”) may indicate an active role for the teacher, which a priori would not be the expected answer in the context of using IGs and DLGs. In contrast, what we would expect in that context is for students to show a preference for answers related to an active role of the student, which is the case for the rest of the items analyzed. However, the fact that the respondents also claim that they learn better when the teacher tells them what to do suggests either that there is a bias in the student responses due to what Yackel and Cobb [ 26 ] call the “norms”, which are also theorized as the “didactic contract” in Brousseau’s terms [ 29 ] and which regulate the social interactions within the classroom, or that the role of the teacher as a leader is recognized by these students.

Table 4 summarizes the previous results in two categories (agree and disagree). The trend noted above can now be clearly seen.
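A short sketch of how the five response categories can be collapsed into the two categories of Table 4 is given below; the numeric coding assumed here (1–2 = agreement, 4–5 = disagreement, 3 = “not sure”) is an illustrative assumption, not taken from the original coding scheme.

```python
# Collapsing the 5-point answers into the two categories of Table 4 (assumed coding).
import pandas as pd

collapse = {1: "agree", 2: "agree", 3: "not sure", 4: "disagree", 5: "disagree"}

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
items = [c for c in df.columns if c.startswith("item_")]

# Share of children agreeing with each item
agree_share = {c: (df[c].map(collapse) == "agree").mean() for c in items}
print({k: round(v, 3) for k, v in agree_share.items()})
```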

Items #4 and #10 are crucial to understanding the attitudes that students have towards learning. In the first case, roughly half of the respondents reject the claim that learning is a boring activity. If we assume that, for a boy or a girl between 7 and 11 years old, defining an activity as fun or boring is a clear way of indicating their attitude towards that activity, the fact that about half of the students participating in the survey declare that learning is not boring suggests that their participation in the IGs and DLGs makes these activities attractive to them in some way.

On the other hand, another relevant aspect of students’ attitudes is the feeling of self-confidence. Previous research has provided ample evidence of the importance of this aspect in the attitudes that children can have towards learning [ 5 ]. Boys and girls who have confidence in themselves tend to show a clearly positive attitude towards learning. The data suggest that this is what happens when students participate in IGs or DLGs: three out of four children affirm that they feel more self-confident with regard to learning in school than they used to. This result is relevant because it suggests that both IGs and DLGs have a clear impact on the positive transformation of attitudes towards learning. The data show that this is true for children in the three schools that participated in the survey, regardless of the country or the context in which they are located.

Principal components analysis

The KMO test indicates whether the partial correlations between the variables are small enough to be able to perform a factorial analysis. Table 2 shows that in this case, the KMO test has a value of 0.517 for the students participating in IGs and 0.610 for the students participating in the DLGs, which allows us to assume (although with reservation) that we can perform a factor analysis to find the principal components explaining the variance. Bartlett’s sphericity test (which contrasts the null hypothesis that the correlation matrix is, in fact, an identity matrix, in which case we could not assume that there are significant correlations between the variables) yields a p-value of 0.000 in both cases, which suggests that we can reject that null hypothesis and, consequently, consider the factorial model adequate to explain the data.
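The KMO and sphericity figures above come from SPSS; an equivalent check can be sketched with the factor_analyzer Python package (an assumption, not the authors’ tool), again with hypothetical file and column names.

```python
# Sketch of the KMO and Bartlett sphericity checks per condition (IG / DLG).
import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
items = [c for c in df.columns if c.startswith("item_")]

# "activity" (IG / DLG) is an assumed grouping column
for label, sub in df.groupby("activity"):
    chi2, p = calculate_bartlett_sphericity(sub[items])
    _, kmo_total = calculate_kmo(sub[items])
    # p < .05 rejects the identity-matrix null; KMO around .5-.6 is marginally adequate
    print(f"{label}: KMO = {kmo_total:.3f}, Bartlett chi2 = {chi2:.1f}, p = {p:.4f}")
```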

After performing ANOVA several times, considering the various items included in the tested models, we found two models (one for the students who had participated in the IGs and another for those engaged in the DLGs) that explained more than half of the variance. Tables 5 and 6 present the results obtained.

(Tables 5 and 6. Extraction method: principal component analysis. In each table, only the cases from the corresponding condition, IG or DLG, were used in the analysis phase.)

As Tables 5 and 6 indicate, the items included in the SAM test explain the attitudes of the students participating in the study better for the DLGs than for the IGs. For the DLGs, we observed four components with eigenvalues above 1, which together explain 74.227% of the variance. In contrast, in the case of the IGs, we find only two components with eigenvalues above 1, which together explain only 63.201% of the variance. This suggests that the SAM test is probably better suited to measuring attitudes in the case of the DLGs.
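The eigenvalue-greater-than-1 (Kaiser) criterion and the percentages of explained variance reported above can be illustrated with a short numpy sketch based on the item correlation matrix; the data layout and column names are assumptions.

```python
# Kaiser criterion and explained variance from the item correlation matrix (assumed data layout).
import numpy as np
import pandas as pd

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
items = [c for c in df.columns if c.startswith("item_")]

R = df[items].corr().values                                # 12 x 12 item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]             # eigenvalues, descending

explained = 100 * eigvals / eigvals.sum()                  # % of variance per component
n_keep = int((eigvals > 1).sum())                          # Kaiser criterion (eigenvalue > 1)
print("eigenvalues:", np.round(eigvals, 3))
print(f"retained components: {n_keep}, cumulative variance: {explained[:n_keep].sum():.1f}%")
```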

The sedimentation (scree) plots make it easier to visualize this result. In the left panel of Fig 1 (the scree plot obtained for the IGs), a clear inflection is observed from component 2. In contrast, in the case of the DLGs, the inflection occurs from component 4 onward.

[Fig 1. Sedimentation (scree) plots for the IGs (left) and the DLGs (right) (pone.0240292.g001).]

The matrix of components suggests that, in the case of the IGs, factor 1 is formed by the components that we can label “active peer support” (#9, “Helping my friends has helped me to understand things better”) and “active listening” (#8, “It is good to hear other people’s ideas”). Factor 2, on the other hand, is formed by the “participation” component (#2, “We can learn more when we can express our own ideas”). For the IGs, the “individualism” component (#6, “I would rather think for myself than hear other people’s ideas”) is clearly the least explanatory (-0.616), which seems to suggest that collaboration within the groups is a fundamental aspect of the learning dynamic occurring within them. Table 7 shows that the most explanatory factor of the variance is the first one; factors 2, 3 and 4 are less important, since their weights are almost irrelevant.

(Table 7. Extraction method: principal component analysis; two components extracted; only the cases for which the activity was IG were used in the analysis phase.)

When we observe the results for the case of the DLGs, the component matrix ( Table 8 ) indicates that factor 1 is formed mainly by the components of “positive discussion” (#11, “I like discussing the books we read with the class”), “self-confidence in school” (#10, “I am more confident about learning in school than I used to be”), and “participation” (#2, “We can learn more when we can express our own ideas”). In contrast, factor 2 contains a single component (#1, “We learn best when the teacher tells us what to do”). In the case of factor 3, the most explanatory component is the sixth one (#6, “I would rather think for myself than hear other people’s ideas”). Finally, factor 4 includes the third component of the SAM test (#3, “Learning through discussing in class is confusing”).

(Table 8. Four components extracted; only the cases for which the activity was DLG were used in the analysis phase.)
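To show how a component matrix of this kind is obtained, the following sketch computes unrotated loadings (eigenvectors scaled by the square roots of their eigenvalues) for the DLG subsample; it is an illustrative reconstruction, not the SPSS procedure used by the authors, and the file and column names are assumptions.

```python
# Unrotated component loadings for the DLG subsample (assumed data layout).
import numpy as np
import pandas as pd

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
items = [c for c in df.columns if c.startswith("item_")]
dlg = df[df["activity"] == "DLG"]                          # "activity" column is an assumption

R = dlg[items].corr().values
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]                          # components sorted by eigenvalue
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_comp = int((eigvals > 1).sum())                          # four components reported for DLGs
loadings = eigvecs[:, :n_comp] * np.sqrt(eigvals[:n_comp]) # item-by-component loading matrix
print(pd.DataFrame(loadings, index=items,
                   columns=[f"factor_{i + 1}" for i in range(n_comp)]).round(3))
```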

Fig 2 shows the graphs of the loading scores for each component in a rotated space, both for the IGs and the DLGs. The data confirm the interpretation of the previous tables. The graphs show that for the IGs (the left side of Fig 2 ), variables #8 and #9 tend to explain the maximum variance of factor 1, while in the case of DLGs, the three-dimensional component chart shows the two slightly differentiated groups of variables.

[Fig 2. Loading scores for each component in rotated space, for the IGs (left) and the DLGs (right) (pone.0240292.g002).]
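The rotated space in Fig 2 corresponds to a rotation of such loadings. A self-contained sketch using a varimax rotation from the factor_analyzer package is given below for the IG case; the rotation method actually applied in SPSS is not stated in the text, so varimax is an assumption, as are the file and column names.

```python
# Varimax-rotated loadings for the IG subsample (assumed rotation method and data layout).
import numpy as np
import pandas as pd
from factor_analyzer import Rotator

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
items = [c for c in df.columns if c.startswith("item_")]
ig = df[df["activity"] == "IG"]                            # "activity" column is an assumption

R = ig[items].corr().values
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_comp = int((eigvals > 1).sum())                          # two components reported for IGs
loadings = eigvecs[:, :n_comp] * np.sqrt(eigvals[:n_comp])

rotated = Rotator(method="varimax").fit_transform(loadings)  # loadings in rotated space
print(pd.DataFrame(rotated, index=items,
                   columns=[f"factor_{i + 1}" for i in range(n_comp)]).round(3))
```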

Construction of subscales of attitudes towards learning in IGs and DLGs

The collected data suggest that the variables obtained with the SAM test may explain the attitudes that students have towards learning in the context of IGs and DLGs. However, according to previous theoretical models, it seems plausible to assume that not all variables are equally precise in explaining the attitudes towards learning shown by the students interviewed in both contexts. For this reason, in this section, we compare two possible scales for each context (IGs and DLGs) to identify which variables would be more reliable in explaining those attitudes.

In the case of the IGs, we created two subscales. The first subscale (Tables 9–11) includes items #1 (“We learn best when the teacher tells us what to do”), #7 (“I enjoy learning when my friends help me”) and #10 (“I am more confident about learning in school than I used to be”). In contrast, subscale 2 (Tables 12–14) incorporates items #2 (“We learn more when we can express our own ideas”), #5 (“Learning in school is better when we have other adults to work with us”) and #11 (“I like discussing the books we read with the class”). Table 9 shows the Cronbach’s alpha value for subscale 1, which is rather mediocre (0.412), while Table 12 indicates that subscale 2 is much more reliable (Cronbach’s alpha of 0.794), suggesting that subscale 2 works better than the first one to characterize the components explaining the results obtained within the IGs. One difference between the two subscales is that the first does not include the role of the other adults working in the classroom, whereas in subscale 2, item #5 (“Learning in school is better when we have other adults to work with us”) is the one that presents the highest correlation (0.613), as seen in Table 13 , which shows the interitem correlation matrix for the variables of subscale 2.

Regarding DLGs, we also created two subscales, i.e., subscale 3 (Tables 15–17) and subscale 4 (Tables 18–20). Subscale 3 is formed by variables #1 (“We learn best when the teacher tells us what to do”), #5 (“Learning in school is better when we have other adults to work with us”), #10 (“I am more confident about learning in school than I used to be”) and #11 (“I like discussing the books we read with the class”). In contrast, subscale 4 includes variables #2 (“We learn more when we can express our own ideas”), #7 (“I enjoy learning when my friends help me”), #8 (“It is good to hear other people’s ideas”) and #9 (“Helping my friends has helped me to understand things better”).

Table 15 shows the results for subscale 3 (DLGs), including a high Cronbach’s alpha value (0.820), indicating that this subscale is a good proposal. According to the data shown in Tables 16 and 17, subscale 3 works even better when component #10 is removed from the model (increasing Cronbach’s alpha from 0.820 to 0.829). This result suggests that self-confidence is not a decisive component of attitudes towards participating in DLGs. Subscale 3 is better than subscale 4, which presents a Cronbach’s alpha value of 0.767, which, although high, is lower than that found for subscale 3.
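The reliability figures for the subscales (0.412, 0.794, 0.820, 0.829 and 0.767) are SPSS outputs; the computation behind them, Cronbach’s alpha and the “alpha if item deleted” check used for item #10, can be sketched as follows. The file name, the item_XX column labels and the “activity” grouping column are assumptions, and item #1 is used in its recoded form.

```python
# Cronbach's alpha and "alpha if item deleted" for subscale 3 (DLG case); assumed data layout.
import pandas as pd

def cronbach_alpha(items_df: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale)."""
    k = items_df.shape[1]
    item_var = items_df.var(axis=0, ddof=1).sum()
    total_var = items_df.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def alpha_if_item_deleted(items_df: pd.DataFrame) -> pd.Series:
    return pd.Series({c: cronbach_alpha(items_df.drop(columns=c)) for c in items_df.columns})

df = pd.read_csv("sam_responses.csv")                      # hypothetical file
dlg = df[df["activity"] == "DLG"]                          # "activity" column is an assumption
dlg = dlg.assign(item_01_rec=6 - dlg["item_01"])           # reverse-coded item #1 (see Methodology)

# Subscale 3 (DLGs): items #1 (recoded), #5, #10 and #11
subscale_3 = dlg[["item_01_rec", "item_05", "item_10", "item_11"]]
print("alpha:", round(cronbach_alpha(subscale_3), 3))      # 0.820 reported above
print(alpha_if_item_deleted(subscale_3).round(3))          # dropping item_10 ~ 0.829 reported
```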

The clearest difference between the two subscales (3 and 4) is that the first one (subscale 3) includes item #11 (“I like discussing the books we read with the class”), which is the item most closely tied to the context of the DLGs. The other items are more related to the interaction among the students (Tables 18–20). However, the amount of correlation explained is lower than that explained by subscale 3. This may explain why subscale 3 is more reliable in measuring students’ attitudes towards learning in social/interactional contexts, such as DLGs.

Discussion and conclusions

Previous research in education has provided ample evidence that attitudes have a relevant impact on learning [ 42 – 46 ]. Studies such as that of Fennema and Sherman [ 3 ] confirmed almost half a century ago that “affective factors (…) partially explain individual differences in the learning of mathematics” [ 3 ]. Currently, we know that learning results depend, to a certain extent, on the attitudes that students have towards learning. When there is clear resistance to school and school practices, it is more difficult for students to achieve good results. Aspects such as motivation or self-concept, for instance, are relevant to explaining a positive attitude towards learning, and these aspects often appear to be correlated [ 19 ]. When a child has a poor self-concept as a student, s/he often feels very unmotivated to learn in school. The literature reports numerous cases of students who actively or passively resist, or even refuse to make an effort to learn their lessons, because they feel that they cannot learn anything. In contrast, when students’ self-image is positive, it is easier for them to learn. In those cases, the data provide evidence of positive correlations between learning achievements and attitudes. Rosenthal and Jacobson [ 5 ] called this type of behavior the Pygmalion effect.

SEAs [ 7 ] such as IGs and DLGs are framed within dialogic learning theory, one of whose main principles is that of transformation. As Freire [ 47 ] claimed, people “are beings of transformation, and not of adaptation.” Education has the capacity to create opportunities for people to transform themselves. Drawing on the assumption that “education needs both technical, scientific and professional training, as well as dreams and utopia” [ 47 ], SEAs integrate practices endorsed by the international scientific community to create real opportunities for learning for children. The data presented in the previous section suggest that children participating either in IGs or in DLGs have a clearly positive attitude towards learning. Table 4 suggests that these children truly enjoy learning. Less than half of the participants say that learning at school is “boring” (41.8%). In contrast, almost eight out of ten children interviewed said they love learning (78.6%). The SAM test items, validated in previous studies [ 10 , 48 , 49 ], have been confirmed as components with which to measure children’s attitudes towards learning. For the first time in the context of studying the impact of actions included within the SEAs [ 7 ], we have been able to identify (and measure) the positive relationship between implementing SEAs and students’ attitudes, i.e., boys and girls engaged in the IGs and/or the DLGs showed a clearly positive attitude towards learning. We can therefore claim that, in the context of SEAs, students show positive attitudes towards learning. The data analyzed suggest that participating in IGs and DLGs empowers students to transform their own attitudes towards learning.

On the other hand, we know that social contexts have a powerful influence on people’s attitudes. A context of positive empowerment, based on the idea of “maximum expectations” [ 50 , 51 ], is able to transform the attitudes that students have towards learning (especially those who are more resistant to learning and school). In contexts where school and school practices are not valued, children have to overcome the social tendency to openly show resistance against school (and everything that the school represents, such as teachers, attitudes of compliance with school activities, norms, etc. ) and embrace a new tendency of valuing all these aspects. However, as previous studies framed within the symbolic interactionism approach have largely demonstrated [ 6 , 8 ], it is hard to go against the social pressure of the group. We define our identity as a result of our interactions with others. If the group finds it attractive to resist schooling, school norms and practices, then it is going to be difficult for individual students to achieve good academic results (unless they draw on a different context from elsewhere) because they have to fight against the social pressure of not valuing school, in addition to the inherent difficulties of learning itself (in cognitive and curricular terms). In contrast, when the context is transformed (to adopt the terms of Freire and Flecha) and learning becomes a valued practice, children usually transform their attitudes, which previous research has correlated with successful learning achievement [ 14 , 15 , 33 ]. The data collected and discussed herein provide evidence of how changing the context (drawing on the two SEAs of IGs and DLGs) can transform students’ attitudes towards learning. As we stated in the previous section, 78.6% of the students participating in the survey claimed that they like to learn after participating either in IGs or in DLGs. They claim that they like it “when my friends help me.” Along the same lines, 78.3% of the respondents said that “it is good to hear other people’s ideas,” while 76.7% claimed that “helping my friends has helped me to understand things better.” This type of answer clearly indicates that IGs and DLGs create a context in which learning is valued positively. Attitudes such as solidarity, willingness to help others, and friendship seem to indicate a preference for a context that is oriented towards learning rather than resisting it. Hence, transforming the context also changes how individuals recreate their own identities using different values as referents, which, drawing on Mead [ 6 ], is how identity creation works. The evidence collected herein suggests that IGs and DLGs work to increase students’ academic performance because they transform the students’ context; hence, students transform their own attitudes (as expected according to the theory of symbolic interactionism).

By analyzing in more detail what happens in both the IGs and the DLGs, we have been able to verify that the attitudes that emerge among the students participating in the IGs and in the DLGs are slightly different. In the case of the IGs, the data collected reveal that children especially value collaborative work with the rest of their classmates, as seen in the answers to items #8 (“It is good to hear other people’s ideas”) and #9 (“Helping my friends has helped me to understand things better”). These two items are the main components explaining the variance detected. On the other hand, in the case of the DLGs, the ability to express one’s own ideas is especially valued. In this case, the variance is explained above all by items #2 (“We learn more when we can express our own ideas”), #10 (“I am more confident about learning in school than I used to be”) and #11 (“I like discussing the books we read with the class”). The last component (#11) clearly belongs to a context such as that of the DLGs. However, the two other components (#2 and #10) suggest that participation in DLGs is related to the development of a positive self-image as a learner. The chi-square test indicates that the correlations are significant in both cases. Therefore, the data obtained suggest that participating in IGs and/or DLGs is related to showing positive attitudes towards learning (both as an individual and as a member of the group, i.e., in a social sense).

On the other hand, when analyzing the reliability of the results, it can be verified that in the case of the IGs, the most important correlation appears for item #5 (“Learning in school is better when we have other adults to work with us”). This finding is very relevant since it constitutes empirical evidence of something that Vygotsky already suggested when he proposed his concept of the zone of proximal development (ZPD): for the process to work, there must be an adult or a more capable peer helping learners achieve what they can only reach with that support. The difference between IGs and other collaborative learning groups is exactly that, i.e., in the IGs there is always an adult who facilitates the activity (who does not provide the answers but encourages the children to engage in dialogic interaction [ 15 ]).

Regarding the DLGs, the most important correlation appears for item #11 (“I like discussing the books we read with the class”), which makes sense in the context of the gatherings. Children affirm that they like to read books together with their classmates. As we know, this activity has clear advantages from the point of view of the development of reading comprehension [ 41 , 52 ].

A surprising finding is the high rate of agreement with item #1 (“We learn best when the teacher tells us what to do”). This would seem to be inconsistent with the use of IGs or DLGs, where the role of the teacher is rather marginal or passive (the teacher organizes the activity but neither gives answers nor explains the academic content as in a traditional lecture). A possible explanation for this result is that the school, as an institution, is characterized by a series of social norms [ 26 ]. Waiting for the teacher’s directions is part of those norms: it is assumed that when attending school, we must pay attention to what the teacher says. This idea corresponds to the social image of the teacher as a transmitter of knowledge, which is part of the social norm characterizing the school as an institution. It is possible that even though the children participating in this study have engaged in IGs and DLGs, they are not excluded from the norms of the social context, so their attitudes are tinged with them.

We can therefore conclude that, according to the SAM test, children who take part in IGs and/or DLGs clearly show positive attitudes towards learning after participating in these two SEAs. Perhaps this is one of the fundamental variables explaining the successful learning results that other studies have found among children using SEAs [ 11 – 15 ].

Future implications

This research confirms some aspects of learning, while it leaves others open for further study. We have observed that children who participate in SEAs show positive attitudes towards learning. However, what we do not know (yet) is whether it is the use of these SEAs that explains why these children show these attitudes or whether the transformation stems from other factors. To address this open question, further experimental research is needed that compares groups of students using SEAs with groups of students using other types of educational actions.

On the other hand, the data discussed herein suggest that there is a social component that has a critical influence on the type of attitudes that students report in the survey. Given how the IGs and DLGs work, solidarity, interaction, and sharing seem to explain why these children develop positive learning attitudes. However, it would be interesting to continue this line of research to see whether this outcome also appears when other educational actions are used whose principles of action are different (when they are centered on the individual, for example).

Finally, the evidence seems to support the statement that the successful academic performance of children who participate in IGs and DLGs is explained by the fact that participating in these two types of SEAs transforms the children’s context towards a positive orientation to learning. Indeed, the results are encouraging. However, we need to replicate this study further to confirm (or refute) that statement. In any case, confirming that statement and addressing the preceding research questions carries a clear implication: teachers have to put effort into designing their lessons, since how they organize their classes truly shapes the extent to which students’ learning is encouraged.

Supporting information

S1 File. Questionnaire used in the study (referred to in the Methodology section).

S2 File. Dataset (SPSS format).



Decision Letter 0

30 Jul 2020

PONE-D-20-08828

Transforming Students’ Attitudes Towards Learning Through the Use of Successful Educational Actions

Dear Dr. Diez-Palomar,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript.

That revision is truly a minor one: according to the PLOS ONE data reporting policy, authors are required to indicate where the study data may be found. Therefore, please add to your indication that “the data used in this article comes (sic) from the research project SEAs4All – Schools as Learning Communities in Europe ” the URLs, accession numbers or DOIs if your data are held or will be held in a public repository. If that does not apply but you are able to provide details of access elsewhere, with or without limitations, please do so. For details, please see the journal’s data reporting guidelines at https://journals.plos.org/plosone/s/submission-guidelines#loc-data-reporting .

Please submit your revised manuscript by Sep 13 2020 11:59PM. If you will need more time than this to complete your revision, please reply to this message or contact the journal office at plosone@plos.org . When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter. You should upload this letter as a separate file labeled 'Response to Reviewers'.
  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see:  http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Christian Stamov Roßnagel

Academic Editor

Journal requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability .

Upon re-submitting your revised manuscript, please upload your study’s minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories . Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions . Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access.

We will update your Data Availability statement to reflect the information you provide in your cover letter.

3. We note you have included a table to which you do not refer in the text of your manuscript. Please ensure that you refer to Table 1, 8, 10, 11, 14, 16, 17, 18, 19 and 20 in your text; if accepted, production will need this reference to link the reader to the Table.

4. Your ethics statement must appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please also ensure that your ethics statement is included in your manuscript, as the ethics section of your online submission will not be published alongside your manuscript.

[Note: HTML markup is below. Please do not edit.]

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

Reviewer #2: Yes

Reviewer #3: Yes

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: I Don't Know

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: (No Response)

Reviewer #2: The paper has a sound theoretical introduction later discussed and enriched with empirical quantitative data. It is remarkable that the authors clearly explain what was already known about the two educational actions explored and which is the specific contribution of this article. Data is collected in three different schools (in UK and Italy), whose contexts are sufficiently explained. The detailed description of the methodology and results is one of the strengths of the paper.

The authors confirm that all data underlying the findings described in their manuscript are fully available without restriction, but they should specify where the data can be found.

Reviewer #3: This research supposes an advance for the educative field, it also enables the improvement of concrete practical education. In order to know about the impact of interactive groups (IG) and dialogic literary gatherings (DLG) in relation to attitudes and academic achievement, this research is original and unpublished.

The research also shows to be well documented, both theoretically and with the latest research carried out. The article has consistency in all sections, which provides coherence and reliability of the study.

6. PLOS authors have the option to publish the peer review history of their article ( what does this mean? ). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy .

Reviewer #1: No

Reviewer #2: No

Reviewer #3: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool,  https://pacev2.apexcovantage.com/ . PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org . Please note that Supporting Information files do not need this step.

Author response to Decision Letter 0

16 Sep 2020

Dear Editor,

Here we attach the article “Transforming Students’ Attitudes Towards Learning Through the Use of Successful Educational Actions” for your consideration, responding to requests that you sent us by July 30th, 2020.

In your message, you requested that we include, check, or review the following aspects:

(1) “Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at:

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf”

We have edited the whole article, according to the PLOS ONE style requirements.

(2) “In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability .

We will update your Data Availability statement to reflect the information you provide in your cover letter.”

We added the following sentence in the methodological section: “The dataset has been submitted to this journal as supporting data for public use.” In addition, we are attaching a file named S2 File 2 containing the dataset in SPSS format.

(3) “We note you have included a table to which you do not refer in the text of your manuscript. Please ensure that you refer to Table 1, 8, 10, 11, 14, 16, 17, 18, 19 and 20 in your text; if accepted, production will need this reference to link the reader to the Table.”

All tables have been reviewed and referred along the main text of the article. Additional / further explanations have been added when needed:

  • (lines 397-398) Table 7 shows that the first factor explains most of the variance, whereas factors 2, 3 and 4 are less important, since their weights are almost negligible.

  • (lines 437-438) suggesting that subscale 2 works better than the first one in characterizing the components that explain the results obtained within the IGs.

  • (lines 459-461) According to the data shown in Tables 16 and 17, subscale 3 works better when component #10 is removed from the model (increasing Cronbach’s alpha from 0.820 to 0.829). This result suggests that self-confidence is not a relevant component of attitudes towards participating in DLGs.

(4) “Your ethics statement must appear in the Methods section of your manuscript. If your ethics statement is written in any section besides the Methods, please move it to the Methods section and delete it from any other section. Please also ensure that your ethics statement is included in your manuscript, as the ethics section of your online submission will not be published alongside your manuscript.”

The ethics statement has been placed in a section beside the methodology, immediately following it.

We would like to thank you very much for all the comments, which have significantly improved the quality of this article.

Javier Díez-Palomar, Rocío García-Carrión, Linda Hargreaves and María Vieites

Submitted filename: rebuttal letter_2.docx

Decision Letter 1

24 Sep 2020

PONE-D-20-08828R1

Thank you for your careful revision of your manuscript.

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/ , click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Additional Editor Comments (optional):

Acceptance letter

28 Sep 2020

Dear Dr. Díez-Palomar:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

PLOS ONE Editorial Office Staff

on behalf of

Mr Christian Stamov Roßnagel

Best Practices for Measuring Students’ Attitudes toward Learning Science

  • Matthew Lovelace
  • Peggy Brickman

*Department of Educational Psychology, University of Georgia, Athens, GA 30602

Address correspondence to: Peggy Brickman ([email protected]).

Department of Plant Biology, University of Georgia, Athens, GA 30602

Science educators often characterize the degree to which tests measure different facets of college students’ learning, such as knowing, applying, and problem solving. A casual survey of scholarship of teaching and learning research studies reveals that many educators also measure how students’ attitudes influence their learning. Students’ science attitudes refer to their positive or negative feelings and predispositions to learn science. Science educators use attitude measures, in conjunction with learning measures, to inform the conclusions they draw about the efficacy of their instructional interventions. The measurement of students’ attitudes poses similar but distinct challenges as compared with measurement of learning, such as determining validity and reliability of instruments and selecting appropriate methods for conducting statistical analyses. In this review, we will describe techniques commonly used to quantify students’ attitudes toward science. We will also discuss best practices for the analysis and interpretation of attitude data.

Science, technology, engineering, and math (STEM) education has received renewed interest, investment, and scrutiny over the past few years ( American Association for the Advancement of Science [AAAS], 2010 ; President's Council of Advisors on Science and Technology, 2012 ). In fiscal year 2010 alone, the U.S. government funded 209 STEM education programs costing more than $3.4 billion ( National Science and Technology Council, 2011 ). At the college level, education researchers have predominantly focused greater effort on demonstrating the results of classroom interventions on students’ intellectual development rather than on their development of “habits of mind, values and attitudes” toward learning science ( National Research Council, 2012 ). However, students’ perceptions of courses and attitudes toward learning play a significant role in retention and enrollment ( Seymour and Hewitt, 1997 ; Gasiewski et al ., 2012 ). Motivation has a strong direct effect on achievement ( Glynn et al ., 2007 ), and, in some courses, students’ attitudes may provide a better predictor of success than quantitative ability ( Steiner and Sullivan, 1984 ).

The current national effort to comprehensively adopt active-learning strategies in college classrooms ( Handelsman et al . 2004 ; Wood and Handelsman, 2004 ; AAAS, 2010 ) provides additional reasons to assess students’ attitudes. Although use of active-learning strategies has repeatedly demonstrated impressive gains in student achievement ( Michael, 2006 ; Freeman et al ., 2007 , 2011 ; Armbruster et al ., 2009 ; Haak et al ., 2011 ), these gains may be strongly tied to changes in learning orientation (at least for problem-based methods; Cruce et al ., 2006 ). Additionally, researchers have characterized significant levels of student resistance ( Powell, 2003 ; Yerushalmi et al ., 2007 ; White et al ., 2010 ) and discomfort with the ambiguity, lack of a “right” response, and multiplicity of views found in these methods ( Cossom, 1991 ). For all these reasons, many researchers have increased their focus on measuring students’ engagement, perceived learning gains, motivation, attitudes, or self-efficacy toward learning science.

There are a wide variety of excellent tools available to gather data on student perceptions. Qualitative analysis tools, such as student interviews, provide rich data that can reveal new insights and allow for flexibility and clarification of students’ ideas ( Slater et al ., 2011 ). However, analyzing written comments or transcripts can be very labor intensive. Quantitative analysis tools, such as survey instruments, can allow for easier compilation of student responses that attach numerical scores to students’ opinions about different aspects of a curriculum along a continuum, say, 1–5, with 1 being “not useful” to 5 being “very useful” for each aspect. The familiar end-of-semester student evaluations of courses and teachers use a combination of quantitative survey items and qualitative open-ended comments. In addition, the Student Assessment of Their Learning Gains Internet site alone has almost 7800 instructors creating surveys that query students’ perceptions of gains in learning ( www.salgsite.org ). To draw the most valid conclusions possible from data collected through such tools, it is important for faculty to choose analyses most appropriate for the task. This review is designed to present an overview of some of the common assessment tools available to measure students’ attitudes toward learning science. The review will also provide widely endorsed, straightforward recommendations for analysis methods with theory and empirical evidence to support analysis plans. Our goal is to help education researchers plan attitudinal studies such that they avoid common pitfalls. We would also like to provide advice and references for supporting your approaches to analyzing and displaying attitudinal data.

INVENTORIES (SCALES) FOR ASSESSING STUDENTS’ ATTITUDES

Pen-and-paper assessments used to gauge psychological characteristics such as attitude are commonly referred to interchangeably as inventories, surveys, instruments, or measurement scales. Psychologists use such tools to assess phenomena of interest, such as beliefs, motivation, emotions, and perceptions that are theoretical constructs not directly observable and often composed of multiple facets. The more psychologists know about the theoretical underpinnings of a construct, the more likely they are to develop reliable, valid, and useful scales ( DeVellis, 2003 ). Psychological constructs are often described as latent , meaning they are not directly observed but are instead inferred from direct measurements of theoretically related variables ( Lord and Novick, 1968 ; Borsboom et al ., 2003 ). The most important methodological concern to stress about scales designed to measure a latent construct is that they are not solely a collection of questions of interest to the researcher. Instead, scales are composed of items that have been subjected to tests of validity to show that they can serve as reasonable proxies for the underlying construct they represent ( DeVellis, 2003 ). Bond and Fox (2007) use the history of the development of temperature measures as an analogy for better understanding measurement theory in the social sciences. Although people customarily refer to the reading on a thermometer as “the temperature,” Bond and Fox explain that a thermometer reading is at best an indirect measure. The estimate of temperature is indirect, because it is determined from the known effects of thermal energy on another variable, such as the expansion of mercury or the change in conductivity of an electrical resistor. Similarly, in the social sciences, numerical representations of psychological attributes (e.g., attitude toward science) are derived from theoretical explanations of their effect on a more readily observable behavior (e.g., response to a set of survey items). In this way, an attitudinal scale score serves as a proxy for the latent construct it is purported to measure, and researchers need to be prepared to defend the validity and limitations of their scale in representing it ( Clark and Watson, 1995 ).

Just as one would not consider a single question adequate to evaluate a student's knowledge about a biology topic, one would not evaluate a complex construct, for example, engagement, with a single item. A scale developed to evaluate engagement would undergo a rigorous, iterative validation process meant to determine the aspects of the underlying construct the scale represents and empirically test the hypothesized relationships between the construct and its observable proxy (Clark and Watson, 1995). A measurement scale is composed of a collection of purposely constructed items backed up by empirical evidence of interrelationship and evidence that they represent the underlying construct (Carifio and Perla, 2007). A minimum of six to eight items is recommended to provide for adequate considerations of generalizability, reliability, and validity (Cronbach et al., 1972). Table 1 lists scales for assessing attitudes in college-level biology students; these scales have met standard criteria set for common tests of validity and reliability.

The basic assumption behind attitude scales is that it is possible to uncover a person's internal state of beliefs, motivation, or perceptions by asking them to respond to a series of statements ( Fraenkel and Wallen, 1996 ). Individuals indicate their preference through their degree of agreement with statements on the scale. Items containing these statements are constructed with three common response formats: dichotomous agree/disagree, semantic-differential, and Likert formats ( Crocker and Algina, 2008 ). In all cases, the items consist of two parts: a question stem and a response option ( Figure 1 ). Dichotomous items contain just two response options (1 = yes, 2 = no; or 0 = disagree, 1 = agree) following a simple declarative statement. Semantic-differential items use a bipolar adjective (opposite-meaning) list or pair of descriptive statements that examinees use to select the response option out of a range of values that best matches their agreement. These semantic-differential items measure connotations. ( Figure 1 contains semantic-differential items from Lopatto [2004].) As demonstrated in Table 1 , Likert items are the most common response formats used in attitude scales. They offer multiple response categories that usually span a 5-point range of responses, for example, A = “strongly agree” to E = “strongly disagree,” but may span any range. ( Figure 1 contains Likert response–format items from Russell and Hollander [1975] and Seymour et al . [2000] .) Generally, internal-consistency reliability is increased and sufficient variances obtained when more than four response options are used ( Masters, 1974 ; Comrey, 1988 ). In addition to the increase in reliability when moving from the dichotomous 2-point range to a 4- or 5-point range, statisticians have demonstrated an increase in type II error rates in 2-point response formats ( Cohen, 1983 ). Response options may be delineated by numbers, percentages, or degrees of agreement and disagreement. Response options may also be structured in several equivalent ways: a numbering system, letters to indicate the responses, or just end points indicated ( Frisbie and Brandenburg, 1979 ; Lam and Klockars, 1982 ).

Figure 1. Common inventory items for assessing attitude. The three most common types of items used in attitude inventories or scales include: dichotomous, semantic-differential, and Likert-type items. All three formats consist of a question stem followed by several response options. Each of these three types differs in the number and types of response options. Dichotomous items contain just two response options, while semantic-differential and Likert-type items are polytomous. Semantic-differential items use a bipolar adjective list or pair of descriptive statements that examinees use to select a response option out of a range of values that best matches their agreement. Likert-type items include a declarative statement followed by several levels of agreement along a span of (usually) five to seven response options. Semantic-differential items from Lopatto (2004). Likert response–format items from Russell and Hollander (1975) and Seymour et al. (2000).

TYPES OF DATA COLLECTED IN ATTITUDINAL SURVEYS

Psychologist Stanley Smith Stevens is credited with developing the theory of data types that are pertinent for pen-and-paper tests used to measure psychological constructs ( Stevens, 1946 ). He set forward “basic empirical operations” and “permissible statistics” for the four levels of measurement scales, terms, and rules he developed to describe the properties of different kinds of data: nominal, ordinal, interval, or ratio ( Table 2 ). Data collected in a nominal format describe qualitative traits, categories with no inherent order, such as demographic information like nationality or college major. Responses to dichotomous items are considered nominal when 0 and 1 merely serve as descriptive tags, for example, to indicate whether someone is male or female ( Bond and Fox, 2007 ). However, dichotomous items may be used to generate ordinal rather than nominal data. For example, the disagree/agree or unsatisfied/satisfied responses to dichotomous items generate data for which a value of 1 represents a meaningfully greater value than that represented by 0. Ordinal data are nominal data with an added piece of quantitative information, a meaningful order of the qualities being measured. This means these data can be rank-ordered (first, second, third, …). In addition to these agree/disagree dichotomous items, responses to semantic-differential items ask participants to place themselves in order along a continuum between two adjectives. Likert items ask participants to rank a set of objects or statements with response options over a range of values: “strongly disagree,” “disagree,” “neutral,” “agree,” and “strongly agree.” These would also commonly be described as ordinal, because the response choices on a particular item are arranged in rank order of, in this case, least amount of agreement to most ( Jamieson, 2004 ; Carifio and Perla, 2007 ; Norman, 2010 ). Both nominal and ordinal data are described as categorical, whereas the two other levels of measurement—interval and ratio—are quantitative ( Agresti, 2007 ). Quantitative data can be further classified as discrete quantitative, only being able to take on certain values, or as continuous quantitative, theoretically able to take on any value within a range ( Steinberg, 2011 ).
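To make the nominal/ordinal distinction concrete, the short sketch below (not part of the original discussion) encodes a hypothetical Likert item as an ordered categorical variable using Python and pandas; the column names and responses are invented for illustration. The point it demonstrates is that rank-order comparisons are meaningful for ordinal data, while the numeric codes attached to the categories carry no guarantee of equal spacing.

```python
# A minimal sketch (hypothetical data) of how the nominal/ordinal distinction
# can be made explicit when coding survey responses in pandas.
import pandas as pd

responses = pd.DataFrame({
    # nominal: labels with no inherent order
    "major": ["Biology", "Chemistry", "Biology", "Physics"],
    # ordinal: Likert responses have a meaningful rank order
    "q1": ["agree", "strongly agree", "neutral", "disagree"],
})

likert_order = ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]
responses["q1"] = pd.Categorical(responses["q1"], categories=likert_order, ordered=True)

# Rank-order comparisons are meaningful for the ordinal column...
print(responses["q1"].min(), responses["q1"].max())
# ...but the integer codes (0-4) do not imply equal spacing between categories.
print(responses["q1"].cat.codes.tolist())
```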

CATEGORICAL (NONPARAMETRIC) VERSUS QUANTITATIVE (PARAMETRIC) DATA ANALYSIS PROCEDURES

In inferential statistics, tests are conducted to determine the plausibility that data taken from a smaller random sample are representative of the parameters measured were the data to be observed for the entire population ( Moore, 2010 ). (See Table 3 for a glossary of statistical terms.) Researchers commonly refer to the statistical tools developed for analyzing categorical data with a nominal or ordinal dependent variable as nonparametric and include the median test; Mann-Whitney U -test; Kruskal-Wallis one-way analysis of variance (ANOVA) of ranks; and Wilcoxon matched-pairs, signed-ranks test ( Huck, 2012 ). These tests involve fewer assumptions than do the parametric test procedures developed for use with quantitative interval- or ratio-level data (such as the assumptions of normality of the distributions of the means and homogeneity of variance that underlie the t and F tests). Parametric statistics are so named because they require an estimation of at least one parameter, assume that the samples being compared are drawn from a population that is normally distributed, and are designed for situations in which the dependent variable is at least interval ( Stevens, 1946 ; Gardner, 1975 ). Researchers often have a strong incentive to choose parametric over nonparametric tests, because parametric approaches can provide additional power to detect statistical relationships that genuinely exist ( Field, 2009 ). In other words, for data that do meet parametric assumptions, a nonparametric approach would likely require a larger sample to arrive at the same statistical conclusions. Although some parametric techniques have been shown to be quite robust to violations of distribution assumptions and inequality of variance ( Glass et al ., 1972 ), researchers will sometimes convert raw scores into ranks to utilize nonparametric techniques when these assumptions have been violated and also when sample sizes are small ( Huck, 2012 ).
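As a hedged illustration of the parametric/nonparametric trade-off described above, the sketch below runs a one-way ANOVA and its rank-based counterpart, the Kruskal-Wallis test, on the same simulated scores. The groups, means, and sample sizes are invented, and SciPy is used only as a convenient tool here, not because the studies discussed used it.

```python
# Contrast a parametric and a nonparametric test on the same hypothetical data:
# three groups of summed attitude scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(30, 5, size=25)
group_b = rng.normal(32, 5, size=25)
group_c = rng.normal(35, 5, size=25)

# Parametric: one-way ANOVA assumes (approximate) normality and equal variances.
f_stat, p_anova = stats.f_oneway(group_a, group_b, group_c)

# Nonparametric analogue: Kruskal-Wallis works on ranks and makes fewer assumptions,
# at the cost of some statistical power when the parametric assumptions actually hold.
h_stat, p_kruskal = stats.kruskal(group_a, group_b, group_c)

print(f"ANOVA:          F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kruskal:.4f}")
```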

The assumption that parametric tests should only be used with interval-level dependent variables is central to the ongoing debate about appropriate analyses for attitudinal data. Statistics such as mean and variance—what are commonly the parameters of interest—are only truly valid when data have meaningfully equidistant basic units; otherwise, using these statistics is “in error to the extent that the successive intervals on the scale are unequal in size” ( Stevens, 1946, p. 679 ). Data from well-designed psychological measurement scales, however, can have properties that appear more interval than ordinal in quality, making classification based on Stevens’ guidelines more ambiguous ( Steinberg, 2011 ). This has led to a great deal of conflicting recommendations over whether to use parametric or nonparametric data analysis procedures for scales based on ordinal data from dichotomous and semantic-differential items, but particularly for Likert-type items ( Knapp, 1990 ; Carifio and Perla, 2008 ). For example, some sources argue that assigning evenly spaced numbers to ordinal Likert-response categories creates a quantitative representation of the response options that is more interval than ordinal and, therefore, practically speaking, could be analyzed as interval quantitative data. This argument supports computing means and SDs for Likert-response items ( Fraenkel and Wallen, 1996 ) and utilizing parametric statistical analysis techniques (e.g., ANOVA, regression) designed for interval data ( Norman, 2010 ). Others argue that equivalency of distances between ranked responses in a Likert-response format should not be assumed and, thus, treating responses to a Likert item as ordinal would lead to a more meaningful interpretation of results ( Kuzon et al ., 1996 ; Jamieson, 2004 ; Gardner and Martin, 2007 ). Stevens (1946) even offered this pragmatic suggestion: “In the strictest propriety, the ordinary statistics involving means and SDs ought not to be used with these [ordinal] scales, for these statistics imply a knowledge of something more than the relative rank-order of data. On the other hand, for this ‘illegal’ statisticizing there can be invoked a kind of pragmatic sanction: In numerous instances it leads to fruitful results” (p. 679).

The reasoning behind varying perspectives on appropriate procedures for analysis of data involving ordinal items has been addressed in further detail elsewhere ( Harwell and Gatti, 2001 ; Carifio and Perla, 2007 , 2008 ; Norman, 2010 ). Marcus-Roberts and Roberts (1987) sum it up best by saying that although it may be “appropriate” to calculate means, medians, or other descriptive statistics to analyze ordinal or ranked data, the key point is “whether or not it is appropriate to make certain statements using these statistics” (p. 386). The decision to analyze ordinal responses as interval quantitative data depends heavily on the purpose of the analysis. In most cases, ordinal-response measurement scales are used to gather data that will allow inferences to be made about unobservable constructs. To simply accept the data as interval would be to ignore both the subjectivity of these opinion-type questions and the response format the numbers represent. The decision clearly needs to first take into account how the sample investigated can be analyzed to infer characteristics about the population as a whole. The sample in this case includes: 1) the individuals surveyed and 2) the number and nature of the questions asked and how they represent the underlying construct. In the following section, we will provide recommendations for analyzing ordinal data for the three most common response formats used in attitudinal surveys. We will argue that, for semantic-differential and Likert-type items, the question of which analysis to perform hinges on the validity of making conclusions from a single item versus a scale (instrument subjected to tests of validity to support representation of an underlying construct).

RECOMMENDED STATISTICAL ANALYSES FOR ATTITUDINAL DATA

Dichotomous items.

There are a variety of statistical test procedures designed for nonparametric data that are strictly nominal in nature. We will focus on providing recommendations for analysis of dichotomous items producing ordinal data because these are most common in attitudinal surveys. For an excellent overview and treatment of comparisons of many different types of categorical data, we recommend reading the chapter “Inferences on Percentages and Frequencies” in Huck (2012, Chapter 17, pp. 404–433) and in Agresti (2007, Chapters 1–4, pp. 1–120). Let us consider a hypothetical research question: Imagine that a researcher wishes to compare two independent samples of students who have been surveyed with respect to dichotomous items (e.g., items that ask students to indicate whether they were satisfied or unsatisfied with different aspects of a curriculum). If the researcher wishes to compare the percentage of the students in one group, who found the curriculum satisfying, with the percentage of students in the second group, who did not, he or she could use Fisher's exact test, which is used for nonparametric data, often with small sample sizes, or an independent-samples chi-square test, which is used for parametric data from a larger sample size (Huck, 2012). The independent-samples chi-square test has the added benefit of being useful for more than two samples and for multiple categories of responses (Huck, 2012). This would be useful in the scenario in which a researcher wished to know whether the frequency of satisfaction differed between students with different demographic characteristics, such as gender or ethnicity. If the researcher wished to further examine the relationship between two or more categorical variables, a chi-square test of independence could be used (Huck, 2012).
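A minimal sketch of the two options named in this hypothetical scenario follows, with invented satisfied/unsatisfied counts for two groups; the table values are assumptions chosen only to make the code runnable.

```python
# Hypothetical counts of satisfied/unsatisfied students in two groups.
from scipy import stats

#                    satisfied  unsatisfied
table = [[40, 10],  # group 1
         [28, 22]]  # group 2

# Fisher's exact test: appropriate for small samples.
odds_ratio, p_fisher = stats.fisher_exact(table)

# Chi-square test of independence: appropriate for larger samples and generalizes
# to more than two groups or more than two response categories.
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.4f}")
print(f"Chi-square:     chi2({dof}) = {chi2:.2f}, p = {p_chi2:.4f}")
```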

Semantic-Differential and Likert Items

As described in Table 2, a semantic-differential or Likert item on its own is most likely ordinal, but a composite score from a measurement scale made up of the sum of a set of interrelated items can take on properties that appear much more continuous than categorical, especially as response options, items, and sample size increase (Carifio and Perla, 2007). For these reasons, many researchers use parametric statistical analysis techniques for summed survey responses, in which they describe central tendency using means and SDs and utilize t tests, ANOVA, and regression analyses (Steinberg, 2011). Still, taking on qualities that appear more continuous than ordinal is not inherently accompanied by interval data properties. A familiar example may better illustrate this point. Consider a course test composed of 50 items, all of which were written to assess knowledge of a particular unit in a course. Each item is scored as either right (1) or wrong (0). Total scores on the test are calculated by summing items scored correct, yielding a possible range of 0–50. After administering the test, the instructor receives complaints from students that the test was gender biased, involving examples that, on average, males would be more familiar with than females. The instructor decides to test for evidence of this by first using a one-way ANOVA to assess whether there is a statistically significant difference between genders in the average number of items correct. As long as the focus is superficial (on the number of items correct, not on a more abstract concept, such as knowledge of the unit), these total scores are technically interval. In this instance, a one-unit difference in total score means the same thing (one test item correct) wherever it occurs along the spectrum of possible scores. As long as other assumptions of the test were reasonable for the data (i.e., independence of observations, normality, homogeneity of variance; Field, 2009), this would be a highly suitable approach.

But the test was not developed to blindly assess number of items correct; it was meant to allow inferences to be made about a student's level of knowledge of the course unit, a latent construct. Let us say that the instructor believes this construct is continuous, normally distributed, and essentially measuring one single trait (unidimensional). The instructor did his or her best to write test items representing an adequate sample of the total content covered in the unit and to include items that ranged in difficulty level, so a wide range of knowledge could be demonstrated. Knowing this would increase confidence that, say, a student who earned a 40 knew more than a student who earned a 20. But how much more? What if the difference in the two scores were much smaller, for example, 2 points, with the lower score this time being a 38? Surely, it is possible that the student with the 40 answered all of the easier items correctly, but missed really difficult questions, whereas the student with the 38 missed a few easy ones but got more difficult questions correct. Further, would a point difference between two very high scores (e.g., between 45 and 50) mean the same amount of knowledge difference as it would for the same difference between two midrange scores (e.g., 22 and a 27)? To make such claims would be to assume a one-to-one correspondence between a one-unit change in items correct and a one-unit change in knowledge. As Bond and Fox (2007) point out, “scales to which we routinely ascribe that measurement status in the human sciences are merely presumed … almost never tested empirically” (p. 4).

The above example illustrates how data with interval qualities can emerge from nominal/ordinal data when items are combined into total scores, but that the assumption of interval properties breaks down when, without further evidence of a one-to-one correspondence, we use the observed total score to indirectly measure a latent construct, such as knowledge or attitude toward a course. As a solution to this problem, many in the measurement field point to item-based psychometric theory, such as Rasch modeling and item response theory (IRT), techniques that allow ordinal data to be rescaled to an interval metric ( Harwell and Gatti, 2001 ). This is accomplished by using the response data for each item of large samples of respondents as empirical evidence to assess and calibrate the mathematical measurement model of their instrument ( Ostini and Nering, 2006 ; Bond and Fox, 2007 ). In short, item response approaches do not assume equal contributions across items to measuring a construct, but instead assume that the probability of a particular response to an item—such as choosing “strongly agree” for a statement related to having a positive attitude toward learning science—is a function of item parameters (e.g., how endorsable the item is) and person parameters (i.e., how much of the latent trait the person possesses). (See Ostini and Nering, 2006 .) Once these parameters are reasonably estimated, the measurement model for the instrument allows the researcher to estimate a new respondent's location along the latent trait being measured by using an interval continuous scale ( Bond and Fox, 2007 ). A comprehensive treatment of the work involved in developing an IRT-based measure is beyond the scope of this article, but we recommend the article “Rescaling Ordinal Data to Interval Data in Educational Research” by Harwell and Gatti (2001) for an accessible account and examples of how IRT can be used to rescale ordinal data. We also recommend the book Applying the Rasch Model: Fundamental Measurement in the Human Sciences by Bond and Fox (2007) , which provides a context-rich overview of an item-based approach to Likert survey construction and assessment.
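For readers who want to see the shape of the model being described, the following sketch implements only the core equation of the dichotomous Rasch model. It does not perform the calibration step, which requires estimating person and item parameters from large samples of responses using dedicated software, as the references above explain; the trait values used here are arbitrary.

```python
# A minimal illustration of the dichotomous Rasch model's core equation:
#   P(endorse) = exp(theta - b) / (1 + exp(theta - b)),
# where theta is the person's latent trait level and b is the item's difficulty
# (endorsability). Estimating theta and b from real data is a separate calibration task.
import numpy as np

def rasch_probability(theta: float, b: float) -> float:
    """Probability that a person with trait level theta endorses an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# A person located exactly at the item's difficulty has a 50% chance of endorsing it;
# the probability rises smoothly (not in equal ordinal steps) as theta - b grows.
for theta in (-2.0, 0.0, 2.0):
    print(theta, round(rasch_probability(theta, b=0.0), 3))
```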

So, to summarize, for surveys that contain either semantic-differential or Likert-type items, decisions about analysis begin by first determining whether you are analyzing data from a single item or from a scale composed of validated, interrelated items (ideally with IRT item characteristic curves to determine the probability of a particular response to an item as a function of the underlying latent trait). Figure 2 presents a decision matrix based on this initial step, offering recommended descriptive statistics and appropriate tests of association in each case.

Figure 2. Best practice flowchart. This flowchart can help with decisions that you make while planning your study. It diagrams appropriate approaches to represent and analyze your data once you are in the analysis stage.

LIKERT DATA ANALYSIS EXAMPLE FROM BIOLOGY EDUCATION

In a study published in a science education research journal, the authors gave a survey of attitudes, concepts, and skills to students in a science research program. Students were surveyed pre-, mid-, and postprogram. The survey consisted of Likert-style items (coded 1–5). Students surveyed were engaged in either a traditional model program or a collaborative model program.

Likert Scale Analysis

In the article, the authors tested the internal reliability (using Cronbach's α) of each set of items (attitudes, concepts, and skills) within the survey to see whether it would be reasonable to analyze each set of items as three separate scales. They wanted to exclude the possibility that all items correlated equally well together, thus indicating they perhaps described a unidimensional, single, latent trait. Also, if the items did not correlate together as predicted, the authors would not have had evidence supporting the validity of the items comprising a scale and should not then sum them to create scores for each scale. The researchers set a criterion that each scale had to meet an α of 0.70 or greater for this to be an appropriate procedure. Scale scores were then analyzed as the dependent variable in separate repeated-measures ANOVAs with gender, ethnicity, and treatment group as between-subject factors.
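The internal-consistency check described above can be reproduced with the standard Cronbach's alpha formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below applies it to simulated Likert responses; the data are invented and the function is an illustrative implementation, not the published study's analysis.

```python
# A sketch of the standard Cronbach's alpha formula applied to a hypothetical
# matrix of Likert responses (rows = students, columns = items).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
trait = rng.normal(size=100)  # simulated latent attitude for 100 respondents
# five items that all partially reflect the same trait, scored 1-5
items = np.clip(np.round(3 + trait[:, None] + rng.normal(0, 0.8, size=(100, 5))), 1, 5)

print(f"Cronbach's alpha = {cronbach_alpha(items):.2f}")  # with these simulated data, well above 0.70
```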

What Is Defendable about This Approach?

The authors checked the reliability of their Likert item sets prior to summing them for analysis. Analyzing a Likert scale (i.e., sum of Likert items), as opposed to single Likert items, likely increased the reliability of the outcome variable. Providing an estimate of the internal consistency of each Likert scale increased confidence that items on each scale were measuring something similar.

What Might Improve This Approach?

The authors reported the internal consistency (a form of reliability) for each of the three scales and the results of their ANOVAs involving these scales, but no other descriptive information about the data, such as measures of central tendency or dispersion. The authors used ANOVA without providing evidence that the data assumptions of this parametric test were met. Although ANOVA is robust in the face of some violations of basic assumptions, such as normality and homogeneous variances with equal sample sizes ( Glass et al ., 1972 ), describing the data would help the reader to better judge the appropriateness of the analyses. Further, the authors’ use of ANOVA treats the dependent variable as interval, but no argument for doing so or limitation of interpretation was provided. For example, the authors could conduct exploratory factor analysis in addition to computing internal reliability (using Cronbach's α) to provide evidence of the clustering of items together in these three categories. Also, if they had adequate numbers of responses, they could use Rasch modeling (IRT) to determine whether the items were indeed of equal difficulty to suggest interval qualities. (See Table 4 for more sources of information about the specifics of ANOVA and its assumptions.) It is also worth noting that, in psychological measurement, many other aspects of reliability and validity of scales are standard in preliminary validation studies. Evidence of other aspects of the scale's reliability (e.g., split-half, test–retest) and validity (e.g., convergent validity, content validity) would bolster any claims that these scales are reliable (i.e., provide consistent, stable scores) and valid (i.e., measure what they purport to measure). Table 4 also contains resources for further information related to these common issues in measurement theory. If the data were judged to be a poor fit with the assumptions of ANOVA, the authors could have chosen a nonparametric approach instead, such as the Mann-Whitney U -test.

LIKERT-ITEM ANALYSIS

In the same article, the authors targeted several individual Likert items from the scale measuring self-perceptions of science abilities. The student responses to these items were summarized in a table. The authors chose these particular items because students’ responses were indicative of key differences between the two types of educational programs tested. The items were included in the table, along with the proportions of students responding “definitely yes” regarding their perceived ability level for a particular task. The authors then conducted separate Fisher exact tests to test for differences in proportions within the “definitely yes” categories by time (pre-, mid-, and postcourse) and then by program model.

When analyzing individual Likert items, the authors used a nonparametric test for categorical data (i.e., Fisher's exact test for proportions). As these Likert items were ordinal to the best of the authors’ knowledge, a nonparametric test was the most fitting choice.

The authors transformed the items into dichotomous variables (i.e., 1 = definitely yes; 0 = chose a lesser category) instead of analyzing the entire spectrum of the 5-point response format or collapsing somewhere else along the range of options. There should be substantive reasons for collapsing categories (Bond and Fox, 2007), but the authors did not provide a rationale for this choice. It often makes sense to do so when there is a response choice with very few or no responses. Whatever the authors’ reason, it should be stated.
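To make the dichotomization step concrete, the sketch below collapses a hypothetical 5-point item into "definitely yes" versus any lesser category and compares the resulting proportions across two invented program groups with Fisher's exact test. None of the numbers come from the study being discussed.

```python
# Hypothetical 5-point responses for two program models, dichotomized before testing.
import numpy as np
from scipy import stats

traditional = np.array([5, 4, 5, 3, 2, 5, 4, 5, 3, 5, 4, 2, 5, 3, 4])
collaborative = np.array([5, 5, 4, 5, 5, 3, 5, 5, 4, 5, 5, 5, 4, 5, 5])

def to_counts(responses: np.ndarray) -> list:
    """Count 'definitely yes' (coded 5) versus any lesser category."""
    definitely_yes = int((responses == 5).sum())
    return [definitely_yes, len(responses) - definitely_yes]

table = [to_counts(traditional), to_counts(collaborative)]
odds_ratio, p_value = stats.fisher_exact(table)
print(table, f"p = {p_value:.3f}")
```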

RECOMMENDATIONS

Validation of an attitudinal measure can be an expensive and labor-intensive process. If you plan to measure students’ science attitudes, look during the planning phases of your study for measurement instruments that have already been developed and validated to measure the qualities you wish to study. If none exist, we recommend collaborating with a measurement expert to develop and validate your own measure. However, if this is not an option—for instance, if you are working with pre-existing data or you do not have the resources to develop and validate a measure of your own—keep in mind the following ideas when planning your analyses (Figure 2).

Avoid Clustering Questions Together without Supportive Empirical Evidence

In some analyses we have seen, the researcher grouped questions together to form a scale based solely on the researcher's personal perspective of which items seemed to fit well together. Then the average score across these item clusters was presented in a bar graph. The problem with this approach is that items were grouped together to make a scale score based on face validity alone (in this case, the subjective opinion of the researcher). However, no empirical evidence of the items’ covariance or their relationship to some theoretical construct was presented. In other words, we have no empirical evidence that these items measure a single construct. It is possible, but it is not always easy, to predict how well items comprise a unidimensional scale. Without further evidence of validity, however, we simply cannot say either way. Failing to at least include evidence of a scale's internal consistency is likely to be noticed by reviewers with a measurement background.

Report Central Tendency and Dispersion Accordingly with the Data Type

For Likert items (not scales), we recommend summarizing central tendency using the median or the mode, rather than the mean, as these are more meaningful representations for categorical data. To give the reader a sense of the dispersion of responses, provide the percentage of people who responded in each response category on the item. In the case of a well-developed scale, it is more appropriate to compute mean scores to represent central tendency and to report SDs to show dispersion of scores. However, keep in mind the admonitions of those who champion item response approaches to scale development (e.g., Bond and Fox, 2007 ): If your measure is of a latent construct, such as student motivation, but your measure has not been empirically rescaled to allow for an interval interpretation of the data, how reasonable is it to report the mean and SD?
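A brief sketch of these two reporting styles follows, using invented data: ordinal summaries (median, mode, and the percentage of responses in each category) for a single Likert item, and a mean with SD for a summed score from a validated scale.

```python
# Hypothetical responses illustrating the recommended summaries for each data type.
import numpy as np
import pandas as pd

item = pd.Series([1, 2, 2, 3, 4, 4, 4, 5, 5, 3, 4, 2])          # one Likert item (1-5)
scale_scores = np.array([18, 22, 25, 19, 27, 24, 21, 26, 23])   # summed scores on a validated scale

# Single item: ordinal summaries.
print("Item median:", item.median(), "| mode:", item.mode().tolist())
print(item.value_counts(normalize=True).sort_index().mul(100).round(1), " (% per category)")

# Validated scale: interval-style summaries (with the caveats noted in the text).
print(f"Scale mean = {scale_scores.mean():.1f}, SD = {scale_scores.std(ddof=1):.1f}")
```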

For Scales, Statistical Tests for Continuous Data Such as F and t tests May Be Appropriate, but Proceed with Caution

Researchers are commonly interested in whether variables are associated with each other in data beyond chance findings. Statistical tests that address these questions are commonly referred to as tests of association or, in the case of categorical data, tests of independence. The idea behind a test of independence (e.g., chi-square test) is similar to commonly used parametric tests, such as the t test, because both of these tests assess whether variables are statistically associated with each other. If you are testing for statistical association between variables, we do not recommend analyzing individual Likert items with statistical tests such as t tests, ANOVA, and linear regression, as they are designed for continuous data. Instead, nonparametric methods for ordinal data, such as the median test or the Mann-Whitney U-test, or analyses designed for ordinal data, such as ordered logistic regression (Agresti, 2007), are more appropriate. If you are analyzing a Likert scale, however, common parametric tests are appropriate if the other relevant data assumptions, such as normality, homogeneity of variance, independence of errors, and interval measurement scale, are met. See Glass et al. (1972) for a review of tests of the robustness of ANOVA in the cases of violations of some of these assumptions. Remember, though, just like an F-test in an ANOVA, statistical significance only refers to whether variables are associated with each other. In the same way that Pearson's r or partial eta-squared with continuous data estimate the magnitude and direction of an association (effect size), measures of association for categorical data (e.g., odds ratio, Cramer's V) should be used in addition to tests of statistical significance.
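As an illustration of pairing a test of independence with an effect-size estimate, the sketch below computes a chi-square test and Cramer's V, defined as sqrt(chi2 / (n * (min(rows, cols) - 1))), for a hypothetical contingency table; the counts are invented.

```python
# Chi-square test of independence plus Cramer's V for a hypothetical 2 x 3 table.
import numpy as np
from scipy import stats

table = np.array([[30, 20, 10],
                  [15, 25, 20]])

chi2, p_value, dof, expected = stats.chi2_contingency(table)
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))

print(f"chi2({dof}) = {chi2:.2f}, p = {p_value:.4f}, Cramer's V = {cramers_v:.2f}")
```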

If you are using statistical methods appropriate for continuous data, gather evidence to increase your confidence that your data are interval, or at least approximately so. First, research the psychological characteristic you are intending to measure. Inquire whether theory and prior research support the idea that this characteristic is a unidimensional continuous trait ( Bond and Fox, 2007 ). Test to see that the data you have collected are normally distributed. If you have developed your own items and scales, provide response options with wordings that model an interval range as much as possible. For example, provide at least five response options, as Likert items with five or more response options have been shown to behave more like continuous variables ( Comrey, 1988 ). If possible, run your analyses with nonparametric techniques and compare your results. If your study will include nonparametric data that may only show small effects, plan from the start for a suitable sample size to have enough statistical power.
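The checks suggested in this paragraph might look like the following sketch: a Shapiro-Wilk test of normality for each group of simulated scale scores, followed by the parametric test and its nonparametric counterpart run side by side so the conclusions can be compared. All data here are simulated assumptions, not results from any of the studies discussed.

```python
# Check normality, then compare a parametric test with its nonparametric counterpart.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
control = rng.normal(22, 4, size=40)       # simulated summed scale scores
intervention = rng.normal(25, 4, size=40)

# Shapiro-Wilk: a low p-value would cast doubt on the normality assumption.
print("Shapiro (control):      p =", round(stats.shapiro(control).pvalue, 3))
print("Shapiro (intervention): p =", round(stats.shapiro(intervention).pvalue, 3))

t_stat, p_t = stats.ttest_ind(control, intervention)
u_stat, p_u = stats.mannwhitneyu(control, intervention, alternative="two-sided")
print(f"t-test:         p = {p_t:.4f}")
print(f"Mann-Whitney U: p = {p_u:.4f}")
```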

Student attitudes impact learning, and measuring attitudes can provide an important contribution to research studies of instructional interventions. However, the conclusions made from instruments that gauge attitudes are only as good as the quality of the measures and the methods used to analyze the data collected. When researchers use scores on attitudinal scales, they must remember that these scores serve as a proxy for a latent construct. As such, they must have supporting evidence for their validity. In addition, data assumptions, including the level of measurement, should be carefully considered when choosing a statistical approach. Even though items on these scales may have numbers assigned to each level of agreement, it cannot automatically be assumed that these numbers represent equally distant units that provide the interval-level data necessary for parametric statistical procedures.

  • Adams WK, Wieman CE (2011). Development and validation of instruments to measure learning of expert-like thinking. Int J Sci Educ 33, 1289-1312.
  • Agresti A (2007). An Introduction to Categorical Data Analysis, 2nd ed. Hoboken, NJ: Wiley.
  • Aikenhead GS, Ryan AG (1992). The development of a new instrument: “Views on Science-Technology-Society” (VOSTS). Sci Educ 76, 477-491.
  • American Association for the Advancement of Science (2010). Vision and Change: A Call to Action, Washington, DC.
  • Armbruster P, Patel M, Johnson E, Weiss M (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sci Educ 8, 203-213.
  • Azen R, Walker CM (2011). Categorical Data Analysis for the Behavioral and Social Sciences, New York: Taylor & Francis.
  • Baldwin JA, Ebert-May D, Burns DJ (1999). The development of a college biology self-efficacy instrument for nonmajors. Sci Educ 83, 397-408.
  • Bond TG, Fox CM (2007). Applying the Rasch Model: Fundamental Measurement in the Human Sciences, New York: Taylor & Francis.
  • Borsboom D, Mellenbergh GJ, van Heerden J (2003). The theoretical status of latent variables. Psychol Rev 110, 203-219.
  • Carifio J, Perla R (2008). Resolving the 50-year debate around using and misusing Likert scales. Med Educ 42, 1150-1152.
  • Carifio J, Perla RJ (2007). Ten common misunderstandings, misconceptions, persistent myths and urban legends about Likert scales and Likert response formats and their antidotes. J Social Sci 3, 106-116.
  • Chen S (2006). Development of an instrument to assess views on nature of science and attitudes toward teaching science. Sci Educ 90, 803-819.
  • Clark LA, Watson D (1995). Constructing validity: basic issues in objective scale development. Psychol Assess 7, 309-319.
  • Cohen J (1983). The cost of dichotomization. Appl Psychol Measure 7, 249-253.
  • Comrey AL (1988). Factor-analytic methods of scale development in personality and clinical psychology. J Consult Clin Psychol 56, 754-761.
  • Cossom J (1991). Teaching from cases. J Teach Social Work 5, 139-155.
  • Crocker L, Algina J (2008). Introduction to Classical and Modern Test Theory, Mason, OH: Cengage Learning.
  • Cronbach LJ, Gleser GC, Nanda H, Rajaratnam NS (1972). The Dependability of Behavioral Measurements, New York: Wiley.
  • Cruce TM, Wolniak GC, Seifert TA, Pascarella ET (2006). Impacts of good practices on cognitive development, learning orientations, and graduate degree plans during the first year of college. J Coll Stud Dev 47, 365-383.
  • DeVellis RF (2003). Scale Development: Theory and Applications, Thousand Oaks, CA: Sage.
  • Elliot AJ, Church MA (1997). A hierarchical model of approach and avoidance achievement motivation. J Pers Soc Psychol 72, 218-232.
  • Field A (2009). Discovering Statistics Using SPSS, Thousand Oaks, CA: Sage.
  • Finney SJ, Pieper SL, Barron KE (2004). Examining the psychometric properties of the Achievement Goal Questionnaire in a general academic context. Educ Psychol Meas 64, 365-382.
  • Fowler FJ (1995). Improving Survey Questions: Design and Evaluation, Thousand Oaks, CA: Sage.
  • Fraenkel JR, Wallen NE (1996). How to Design and Evaluate Research in Education, New York: McGraw-Hill.
  • Freeman S, Haak D, Wenderoth MP (2011). Increased course structure improves performance in introductory biology. CBE Life Sci Educ 10, 175-186.
  • Freeman S, O’Connor E, Parks JW, Cunningham M, Hurley D, Haak D, Dirks C, Wenderoth MP (2007). Prescribed active learning increases performance in introductory biology. Cell Biol Educ 6, 132-139.
  • Frisbie DA, Brandenburg DC (1979). Equivalence of questionnaire items with varying response formats. J Educ Measure 16, 43-48.
  • Gardner HJ, Martin MA (2007). Analyzing ordinal scales in studies of virtual environments: Likert or lump it! Presence-Teleop Virt 16, 439-446.
  • Gardner PL (1975). Scales and statistics. Rev Educ Res 45, 43-57.
  • Gasiewski JA, Eagan MK, Garcia GA, Hurtado S, Chang MJ (2012). From gatekeeping to engagement: a multicontextual, mixed method study of student academic engagement in introductory STEM courses. Res High Educ 53, 229-261.
  • Glass GV, Peckham PD, Sanders JR (1972). Consequences of failure to meet assumptions underlying fixed effects analyses of variance and covariance. Rev Educ Res 42, 237-288.
  • Glynn SM, Brickman P, Armstrong N, Taasoobshirazi G (2011). Science Motivation Questionnaire II: validation with science majors and nonscience majors. J Res Sci Teach 48, 1159-1176.
  • Glynn SM, Taasoobshirazi G, Brickman P (2007). Nonscience majors learning science: a theoretical model of motivation. J Res Sci Teach 44, 1088-1107.
  • Groves RM, Fowler FJ Jr., Couper MP, Lepkowski JM, Singer E, Tourangeau R (2004). Survey Methodology, New York: Wiley.
  • Haak DC, HilleRisLambers J, Pitre E, Freeman S (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science 332, 1213-1216.
  • Halloun I, Hestenes D (1996). Views About Sciences Survey: VASS. Paper presented at the Annual Meeting of the National Association for Research in Science Teaching, St. Louis, MO, March 31-April 3, 1996.
  • Handelsman J, Beichner R, Bruns P, Chang A, DeHaan R, Ebert-May D, Gentile J, Lauffer S, Stewart J, Wood WB (2004). Universities and the teaching of science [response]. Science 306, 229-230.
  • Handelsman MM, Briggs WL, Sullivan N, Towler A (2005). A measure of college student course engagement. J Educ Res 98, 184.
  • Harwell MR, Gatti GG (2001). Rescaling ordinal data to interval data in educational research. Rev Educ Res 71, 105-131.
  • Huck SW (2012). Reading Statistics and Research, Boston: Pearson.
  • Hunter A-B, Laursen SL, Seymour E (2007). Becoming a scientist: the role of undergraduate research in students’ cognitive, personal, and professional development. Sci Educ 91, 36-74.
  • Jamieson S (2004). Likert scales: how to (ab)use them. Med Educ 38, 1217-1218.
  • Keppel G, Wickens TD (2004). Design and Analysis: A Researcher's Handbook, 4th ed. Upper Saddle River, NJ: Pearson.
  • Knapp TR (1990). Treating ordinal scales as interval scales: an attempt to resolve the controversy. Nursing Res 39, 121-123.
  • Kuzon WM, Urbanchek MG, McCabe S (1996). The seven deadly sins of statistical analysis. Ann Plast Surg 37, 265-272.
  • Lam TCM, Klockars AJ (1982). Anchor point effects on the equivalence of questionnaire items. J Educ Measure 19, 317-322.
  • Lopatto D (2004). Survey of Undergraduate Research Experiences (SURE): first findings. Cell Biol Educ 3, 270-277.
  • Lord FM, Novick MR (1968). Statistical Theories of Mental Test Scores, Reading, MA: Addison-Wesley.
  • Marcus-Roberts HM, Roberts FS (1987). Meaningless statistics. J Educ Stat 12, 383-394.
  • Masters JR (1974). The relationship between number of response categories and reliability of Likert-type questionnaires. J Educ Measure 11, 49-53.
  • Michael J (2006). Where's the evidence that active learning works? Adv Physiol Educ 30, 159-167.
  • Moore DS (2010). The Basic Practice of Statistics, New York: Freeman.
  • National Research Council (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: National Academies Press.
  • National Science and Technology Council (2011). The Federal Science, Technology, Engineering, and Mathematics (STEM) Education Portfolio, Washington, DC.
  • Norman G (2010). Likert scales, levels of measurement and the “laws” of statistics. Adv Health Sci Educ 15, 625-632.
  • Ostini R, Nering ML (2006). Polytomous Item Response Theory Models, Thousand Oaks, CA: Sage.
  • Powell K (2003). Spare me the lecture. Nature 425, 234-236.
  • Pintrich PR (1991). A Manual for the Use of the Motivated Strategies for Learning Questionnaire (MSLQ). Report no. ED338122. Ann Arbor, MI: National Center for Research to Improve Postsecondary Teaching and Learning.
  • Pintrich PR, Smith DAF, Garcia T, Mckeachie WJ (1993). Reliability and predictive-validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educ Psychol Meas 53, 801-813.
  • President's Council of Advisors on Science and Technology (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics, Washington, DC: Executive Office of the President.
  • Presser S, et al. (2004). Methods for Testing and Evaluating Survey Questionnaires, New York: Wiley.
  • Russell J, Hollander S (1975). A biology attitude scale. Am Biol Teach 37, 270-273.
  • Semsar K, Knight JK, Birol G, Smith MK (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for use in biology. CBE Life Sci Educ 10, 268-278.
  • Seymour E, Hewitt N (1997). Talking about Leaving: Why Undergraduates Leave the Sciences, Boulder, CO: Westview.
  • Seymour E, Wiese DJ, Hunter A-B, Daffinrud SM (2000). Creating a better mousetrap: on-line student assessment of their learning gains. Paper presented at the National Meeting of the American Chemical Society, San Francisco, CA, March 26-30, 2000.
  • Slater SJ, Slater TF, Bailey JM (2011). Discipline-Based Science Education Research: A Scientist's Guide, New York: Freeman.
  • Steinberg WJ (2011). Statistics Alive!, Thousand Oaks, CA: Sage.
  • Steiner R, Sullivan J (1984). Variables correlating with student success in organic chemistry. J Chem Educ 61, 1072-1074.
  • Stevens SS (1946). On the theory of scales of measurement. Science 103, 677-680.
  • Terenzini PT, Cabrera AF, Colbeck CL, Parente JM, Bjorklund SA (2001). Collaborative learning vs. lecture/discussion: students' reported learning gains. J Eng Educ 90, 123-130.
  • White J, Pinnegar S, Esplin P (2010). When learning and change collide: examining student claims to have “learned nothing.” J Gen Educ 59, 124-140.
  • Wood WB, Handelsman J (2004). Meeting report: the 2004 National Academies Summer Institute on Undergraduate Education in Biology. Cell Biol Educ 3, 215-217.
  • Yerushalmi E, Henderson C, Heller K, Heller P, Kuo V (2007). Physics faculty beliefs and values about the teaching and learning of problem solving. I. Mapping the common core. Phys Rev ST Phys Educ Res 3, 020109.
  • Zimmerman LK (1996). The development of an Environmental Values Short Form. J Environ Educ 28, 32-37.


Classroom Q&A

With Larry Ferlazzo

In this EdWeek blog, an experiment in knowledge-gathering, Ferlazzo will address readers’ questions on classroom management, ELL instruction, lesson planning, and other issues facing teachers. Send your questions to [email protected]. Read more from this blog.

What Students Are Really Thinking About Online Learning



Today, several students from my classes “wrap things up” in the final post of this series.

“The temptations are REAL!”

Lee Xiong is a junior at Luther Burbank High School:

School has been tough. Transferring to all online learning has been the biggest challenge this year for me. As a student, I’d say I’ve usually kept up with all my work for all my classes. The biggest change I’ve seen in myself is becoming less focused with my school work.

Being in a physical classroom is tremendously different from learning online. In a classroom, most of your focus is there, unlike virtually, the temptations are REAL! Yes, self-discipline is good to learn, but when having all this thrown at you, you can’t blame the student for not wanting to work... at least that’s my opinion.

This online learning has affected me personally because during this time, I found myself turning in assignments weeks late. It wasn’t because I was having trouble, it was because I had no motivation and energy to do them. This isn’t the norm for me. Without a routine schedule, I felt lost. That makes me sound like a robot, but I think it’s because it’s been that way since we were so small, change this big is affecting me to the max.

This has taught me that online learning will not be for me in the future! Maybe for one or two classes, but overall I plan for my school life to be set in a physical classroom for the most part. Although this has been a challenging time for school and out in the real world, remembering to stand tall will get us through this together.


“Learning at school is best for me”

Evelynn Vang is a junior at Luther Burbank High School:

The online learning experience as a student for me has been fine. I sometimes find myself not interested in doing my assignments and I feel like I’m lazy. I still do the assignments, but I sometimes end up turning in my assignments late. It’s like I’ll do the assignments whenever I feel like doing it.

I can say that there is a reason for this, and that is where I am doing my school work. My home is not a learning environment like at school, where there are teachers, other students, learning tools, desks/tables, chairs, a library, lots of space, and those who you can get support from. At home is like a sleeping or resting environment. In a classroom, I can focus more on my assignments/work and get engaged in the subject. Whenever I’m in a classroom, I feel prepared to learn and get my brain pumped; at home, I feel like it’s very hard to be prepared because I’m always getting distracted. Whenever I need help, my teachers or classmates are there for me. When I have a question at home, I have to wait for a response.

I do have to say that whenever I’m at school, I always feel nervous in class. Now that I’m at home learning, I don’t feel nervous. From my online learning experience right now, I would not choose more online learning in the future because in a school, a classroom is a learning environment. Also, I feel like it’s easier to communicate with my classmates/groups for projects, teachers, counselors, and principal. Learning at a school is best for me.

"At home I feel like it's very hard to be prepared because I'm always getting distracted."

“I have many responsibilities at home”

Diana Lopez is a junior at Luther Burbank High School:

As a student, my online learning experience hasn’t been great. This new learning system has its perks, such as more time to do assignments in the comfort of your home, not having to wake up so early to go to school, and ensuring the safety of the staff as well as the students. Despite these benefits, there are downsides of this method of learning. For example, I have many responsibilities at home, such as taking care of my younger siblings, cooking meals, cleaning up after them, etc. I also find it harder to have any motivation when I’m doing school assignments. When I’m surrounded by all these other temptations like my phone or other electronics, I lose any will to do work.

The environment at home is different from the workspace students have at school. A classroom provides a quiet academic place to do work while a household can be loud and cause students to lose concentration or not even work at all. Additionally, I find that simply reading the instructions for an assignment or lesson isn’t as engaging as when it’s explained by a teacher. The information is much easier to retain when heard rather than simply rushing to read the directions. If I could choose, in the future I would not like to do more online learning because I like having a teacher physically there to help me when I need it. Having a teacher presence helps me focus more on school work, engages me into learning, and the teachers help guide me through the work and are there for any questions I have.


“Online learning has been difficult”

Isabella Sandoval is a junior at Luther Burbank High School:

Online learning has been difficult. I feel pressured to try and hurry to finish and turn in all of my assignments on time. Most of my assignments are due at the same time, and a lot of them are time-consuming.

Though, for the most part it’s difficult to adapt to since I’ve had my education in person with my teachers and classmates, I like how I can do the assignments on my own time. I could divide the day and time I complete my work, I can sleep in a little longer, and overall just be comfortable while in my own home. I feel that online learning is nothing compared to physical learning. With physical learning, I can talk to my teachers one on one and visually see and interact with everything. Whereas online, when I have a question, I either have to email or text my teachers, and sometimes they don’t see my message and/or take forever to respond.

In the future, I honestly would not mind doing online learning. Just for a little bit though, because it’s not that bad, it’s just the fact that I can’t physically talk to my teachers in person when I need help or have questions. Communicating with teachers online is what I feel is the most difficult part about online learning.


“My online experience has been interesting”

Brenda Hernandez is a junior at Luther Burbank High School:

As a student, my online experience has been interesting. What I like about this experience is that I have more time to talk to my family and call or text some friends. I get to do school work from home and I have time for self-care. I like that I kind of get to choose which classes I should work on first and which I could wait to do after.

What I don’t like about it is that I am on a screen all day. I like electronics, but school has kept me from staring at a screen for hours. I also don’t like that I have more distractions at home. I live in a small apartment with five other people and four dogs.

This experience is different from being in a physical classroom because I socialize less now. In school, I get to hear the opinions and ideas of my friends and classmates. Some of my teachers would tell us to talk to the people around us about the lesson. Now, not everyone’s online at the same time. I have anxiety, which prevents me from texting some friends and some of my classmates. And if I did, they’d take a while to respond. Same with communicating with teachers.

In the future, if I could choose, I’d like to do a bit of online learning and the rest in an actual classroom. Although it depends on the class. I have noticed that some of the classes I’ve been able to complete at home since there isn’t anyone asking questions or reading the directions to stall me from beginning my work. In other classes, it has been more difficult since I’m more of a visual learner for that subject, and my teachers keep me on task.


“My online learning experience hasn’t been the best but not worst experience”

Laitak Briand is a junior at Luther Burbank High School:

Being an engaging student during quarantine has been difficult. There have been a lot of things that happened during the first weeks since school was canceled. Stores began to close down, parks were shut down, and people were told to stay in the house 24-7 unless they needed their necessities.

What I liked about it, though, is that I have more time to do things that I said I wanted to do if I only had time. Now I have time to do things like spend time with family and resting. What I don’t like about online learning is that I have to still do homework even though we are in a pandemic and can’t leave the house.

The experience from doing online learning and going to school physically are vastly different. With online classes, if you need help you have to ask your parents or google. But when you go to school, there is a teacher that can help you. Also, my friends I can’t physically see them when I’m at home, but if I went to school, I could. In the future, if I had to choose to continue online learning or not, I’d choose not because I like to be somewhere I can ask someone near me for help and see if I did something right or wrong. In conclusion, my online learning experience hasn’t been the best but not worst experience I have ever had.

"Now I have time to do things like spend time with family and resting."

“There is nothing that I liked about it besides how supportive the teachers have been”

Na Lee Her is a junior at Luther Burbank High School:

My experience with online learning is very stressful and hard. I felt this way because of how hard it is for me to understand the assignments and having to not be able to check with your teacher face to face if you are doing it correctly or not. It doesn’t make me confident because I want to make sure that I am actually doing the assignment correctly in order to deserve the credit for it.

Not only that, but having time to do the assignments is another problem. At home, there are many things to take care of, and it makes it hard for me to be able to do my assignments. This makes me turn in the assignment late or not turn it in at all. Last but not least, it is the lack of motivation that makes online learning hard. Not being able to be face to face with friends and teachers gives me no motivation and makes me unhappy about this. I am unable to get ideas from them, and it makes me lose hope because I don’t know what I will do to be able to complete the assignment and meet its requirement. It just makes me very worried and anxious to know that I may have done things wrong or to not know what to do.

During this time of online learning, there is nothing that I liked about it besides how supportive the teachers have been. If I were to choose online learning or learning face to face, I would rather choose learning face to face. I choose this because it is much easier and I get my questions answered right away. Not only that, but I can also get suggestions/ideas from my peers as well.

"My experience with online learning is very stressful and hard."

Thanks to Lee, Evelynn, Diana, Isabella, Brenda, Laitak, and Na Lee for their contributions!

(This is the final post in a multipart series. You can see Part One here, Part Two here, and Part Three here.)

Here is the new question-of-the-week:

What has your online learning experience been as a student? What did you like about it? What didn’t you like about it? How does it compare with your experience as a student in a physical classroom? In the future, if you could choose, would you want to do more online learning? If so, why? If not, why not?

In Part One, five students from the high school where I teach in Sacramento, Calif., shared their reflections.

In Part Two, contributions come from students in Austin Green’s 1st grade class in Utah and others connected with the Kansas State School for the Blind.

In Part Three, contributors came from my class; Ryan Jakacki’s class in Plymouth, Minn.; and Anne Magnin’s class in France.

Consider contributing a question to be answered in a future post. You can send one to me at [email protected]. When you send it in, let me know if I can use your real name if it’s selected or if you’d prefer remaining anonymous and have a pseudonym in mind.

You can also contact me on Twitter at @Larryferlazzo.




Attitudes towards Education Essay


Education is one of the most useful and long-lasting assets that can be given to children. It is passed from one person to another either formally or informally, and it equips students with the knowledge to handle various challenges in life, whether at work or in the social sphere.

Because of the value attached to it, governments in some countries offer education to citizens free of charge or at subsidized rates, and rules govern its provision to ensure that it reaches everyone and that implementation runs smoothly. In most developing countries, however, parents pay their children’s fees with minimal or no assistance from the government.

In these countries, owing to a lack of resources, illiteracy rates are usually high, and many children quit school to engage in economic activities. This paper compares and contrasts the attitudes toward education of students who work to finance their own education and of students who do not, outlining the differences and the reasons they exist.

Some students work to finance their education, taking jobs while still studying in order to cover the shortfall in their school fees. This arises from the competing demands of tuition, housing, and general upkeep when the funds available to meet them are limited. The number of students who work while pursuing their education is currently increasing.

Working tends to affect their academic performance and social life, and the effects may be positive or negative depending on the student. A student who finds a well-paying job may decide to keep working and put education on hold, which usually damages his or her attitude towards education. The effect varies from student to student and depends on the student’s long-term rather than short-term objectives (Moschetti 8).

Regarding attendance, students who work to finance their education are sometimes inconsistent in attending classes. They usually hold part-time jobs, and when working hours clash with class time, they often opt to miss class and go to work.

This is usually done to protect their source of income (Wagdarikar et al. 10). They then spend their free time catching up on whatever was covered in their absence. Students whose fees are fully paid, by contrast, are more likely to attend all their classes.

The need to keep up with coursework in limited time often pushes working students to improve their attitude towards education. This is not always the case, however, as some jobs are all-consuming and a student may end up neglecting academic duties.

Students who work during their studies interact with a variety of people and, through this interaction, learn aspects of life they would not have learnt in class. They also encounter challenges that give them a chance to learn how to solve similar problems in the future, so their minds develop in a well-rounded way.

They gain real-life experience of their classwork and are therefore more likely to understand concepts better. Overcoming workplace challenges requires applying their knowledge, which can strengthen their appreciation for that knowledge and improve their attitude towards it. Students who do not work while studying, by contrast, often face many challenges when they enter the job market because they lack sufficient exposure to the field they studied.

Leadership is learnt through practice as well as observation. When students work, they learn how to handle and manage various issues, which call for different tactics and highlight the importance of education in the job market. Other employees provide guidance whenever they get stuck.

They also learn leadership from senior employees by observing how they handle challenging issues. In carrying out their duties, students may be put in charge of other employees, depending on their knowledge and skills, which provides a good platform for developing managerial skills. The managerial skills learnt at work are then applied at school, usually with positive results (Williamson 6).

The workplace brings together a diverse range of people, so working students learn how to interact and relate with others, which helps them develop socially. At school, fellow students rarely change, so social interaction stays much the same; at work, staff turnover and the arrival of new employees foster the development of social skills.

Because of this increased interaction and exposure, students who work while studying are well informed about how academic achievement is rewarded in the job market. This motivates them to work on their weak points, which could prove more rewarding in the future, and helps them make good choices about which units to study.

Students who work to finance their studies usually develop a greater affection for education. They may initially lack a positive attitude towards it, but their experiences and challenges in the workplace lead them to appreciate its importance and so improve their attitude towards it.

Moschetti, Ram. Understanding the Experiences of White, Working-Class, First-Generation Community College Students. Santa Barbara: University of California, 2008. Print.

Wagdarikar et al. Organizational Citizenship Behavior and Perception of Organizational Justice in Student Jobs. Munich: GRIN Verlag, 2010. Print.

Williamson, Kim. Working Students: An Exploration of Young People’s Experiences Combining Tertiary Education and Employment. Auckland, New Zealand: University of Auckland, 2006. Print.



Develop a Positive Attitude to Learning: 11 Strategies to Empower Your Mind

by Laura | Nov 17, 2023 | Positive Thinking

A positive attitude to learning is integral to being a lifelong learner. It involves an optimistic and proactive approach to acquiring new knowledge, skills, and experiences, even when confronted with challenges or setbacks.

This willingness to continuously seek knowledge and grow contributes significantly to personal and professional development, fostering resilience and open-mindedness.

In this blog post, we will explore 11 effective strategies for cultivating an appreciation for learning and finding joy in the journey of self-improvement.

11 Strategies for Developing a Positive Attitude To Learning

These strategies are not only beneficial for academic success but also for personal growth and professional development. They are designed to spark your curiosity, enhance your resilience, and empower your mind to embrace lifelong learning.

1. Embrace a Growth Mindset

A growth mindset is a belief in your ability to learn and grow. It’s seeing challenges as opportunities for personal development. This perspective propels you forward and encourages you to step out of your comfort zone.

It also prompts you to view mistakes not as failures but as stepping stones on your learning journey. Each setback is an opportunity to learn and a chance to identify areas for improvement. It’s a signal to reassess and adjust your strategy.

When you embrace this mindset, negative experiences become fertile ground for growth.

2. Set Clear and Achievable Goals

Begin by reflecting on your aspirations. Ask yourself, “What are my true goals in life?” Then challenge yourself with projects that move you toward those goals.

Try to stay motivated and excited about what you’re working towards. This way, you’re more likely to keep a positive attitude when things get tough.

It’s also important to reward yourself when you meet your goals. These rewards will provide positive reinforcement and make the process more enjoyable.

3. Cultivate Curiosity

Cultivating curiosity means fostering a desire to explore the world around you and to seek new information.

Allow your mind to explore unfamiliar territory and question the status quo. Revel in the diversity of human knowledge and culture. The more open you are to new ideas, the richer your learning experience will be.

Also, try to enjoy the learning process, not only the outcome. This can make the journey as rewarding as the destination.

So, be relentless in your pursuit of knowledge and savor the joy of discovery. After all, learning is a lifelong journey.

4. Practice Positive Self-Talk

It’s easy to fall prey to negativity, especially when faced with setbacks. To counter this, practice affirmations that reinforce your abilities. Remind yourself that you’re on an ongoing learning journey.

Instead of thinking “I can’t do this,” tell yourself “I can learn how to do this.” Remember that mistakes are valuable learning opportunities. Focus on the progress you’ve made so far and the new skills you’ve gained. Keep your spirits high and stay motivated.

5. Foster Resilience

Fostering resilience is crucial in your learning journey. Encountering obstacles is natural, but your response shapes your learning experience.

Consistency is the first step. It helps you build good habits, keeping you engaged and focused. This makes it easier to rebound from setbacks.

Then adaptability helps you accept change rather than resist it.

So, a consistent routine provides stability. Adaptability keeps you flexible to change. Together, they equip you with the resilience to bounce back from challenges.

6. Maintain a Healthy Lifestyle

Having a healthy lifestyle is vital for effective learning and cognitive development:

  • Physical exercise is good for your health and improves your brain function. The increased blood flow to the brain eases the delivery of oxygen and nutrients crucial for good cognitive performance.
  • Eating nutritious foods provides the vitamins and minerals needed for brain health. This article by the Mayo Clinic Health System explains how certain foods improve your memory, focus, and cognitive abilities.
  • Finally, adequate rest is crucial for transforming short-term memories into long-term ones.

7. Practice Meditation

Meditation encourages a state of active, open attention to the present. It allows you to stay focused on the task at hand and better absorb information.

Moreover, learning can sometimes be overwhelming, leading to stress and anxiety. So with meditation, you can learn to manage these feelings.

Start with a few minutes daily. Then increase the duration as you get more comfortable with the practice. The goal is to observe your thoughts without judgment to foster a sense of calm and focus.

8. Make Learning Fun

Enjoying your learning experience can enhance your engagement and motivation.

Consider incorporating elements of play or competition. You could set up friendly competitions with peers or adopt a gamified learning app. Transforming challenges into games can make learning more vivid and exciting.

So try merging enjoyable elements with learning. This will make your study sessions something to look forward to.

9. Surround Yourself with Positive Influences

Creating a comfortable study space is essential for effective learning. So design a spot that inspires you to delve into your work with a fresh mind.

Bright, well-lit spaces with minimal distractions work best.

Decorate your space with items that bring you joy and inspire creativity. Incorporate elements that elevate your mood and enhance your productivity.

10. Cultivate a Love for Reading

Cultivating a love for reading can be a game-changer in your learning journey.

Reading has the power to introduce you to new ideas and broaden your perspectives. By diving into a variety of genres, you can enrich your knowledge base and stimulate your creative thinking.

Start with books or topics that interest you, and then expand your reading range.

Every book you read is an opportunity to learn something new. That’s the beauty of books – they’re a never-ending source of knowledge, inspiration, and joy.

11. Connect Learning to Real Life

One of the best ways to cement your learning is to tie it to real-world applications. By doing so, your new knowledge becomes more relevant and easier to remember.

By applying what you’ve learned you get a deeper understanding of the subject matter. So, start putting concepts into practice. This approach will make what you learn more impactful and lasting.

Everything you learn has the potential to serve a practical purpose. Every experience becomes an opportunity to apply your knowledge and gain more insights.

Finally, never forget why you started. Reminding yourself of your goals and motivations will help you stay on track and keep pushing forward.

Every learning journey has its ups and downs. But with the right mindset and habits, it can be a fulfilling and transformative experience.

Related article: Positive Thinking: Your Ultimate Guide to Cultivating a Positive Mindset for Personal Growth

Benefits of Having a Positive Attitude To Learning

Embracing a positive attitude to learning has a lot of benefits. Among these is making your educational journey more enjoyable and fruitful.

Here’s why cultivating this mindset matters:

  • Amplified motivation: A good mindset increases your motivation and becomes your driving force. It gives you the determination to pursue your goals.
  • Career avenues: Eagerness to learn new skills can open doors to career opportunities and promotions. Plus, it can pave the way for personal growth and development.
  • Increased self-confidence: A positive learning attitude boosts self-confidence, providing a sense of accomplishment and self-worth.
  • Fueled creativity: Positivity ignites creativity, encouraging you to think beyond boundaries and develop problem-solving skills.
  • Sharper memory: A positive mindset enhances memory, making it easier to retain and recall information.

So, do your best to cultivate a positive attitude to learning. You’ll gain knowledge and pave the way for growth, academic success, and a more fulfilling life.

Wrapping It Up

Positive thinking isn’t only about optimism; it’s a key to navigating challenges. Beyond reducing stress, it enhances your problem-solving skills. It is also crucial for your mental well-being.

A positive attitude to learning can spark your creativity. It can also help you cultivate an encouraging attitude toward progress. With an enthusiastic approach, your successes can multiply and your confidence strengthens.

Moreover, this positive mindset acts as a catalyst for personal growth. It’s like a magnet, attracting positive vibes and opportunities.

In essence, fostering a positive attitude makes life more enjoyable and fulfilling. So, embrace positivity, and watch how it transforms your journey.



