Qualitative vs. Quantitative Research: Comparing the Methods and Strategies for Education Research


No matter the field of study, all research can be divided into two distinct methodologies: qualitative and quantitative research. Both methodologies offer education researchers important insights.

Education research assesses problems in policy, practices, and curriculum design, and it helps administrators identify solutions. Researchers can conduct small-scale studies to learn more about topics related to instruction or larger-scale ones to gain insight into school systems and investigate how to improve student outcomes.

Education research often relies on the quantitative methodology. Quantitative research in education provides numerical data that can support or refute a theory, and administrators can easily share the number-based results with other schools and districts. And while a study may draw on a relatively small sample, educators and researchers can use the quantifiable data to predict outcomes in larger student populations and groups.

Qualitative vs. Quantitative Research in Education: Definitions

Although there are many overlaps in the objectives of qualitative and quantitative research in education, researchers must understand the fundamental functions of each methodology in order to design and carry out an impactful research study. In addition, they must understand the differences that set qualitative and quantitative research apart in order to determine which methodology is better suited to specific education research topics.

Generate Hypotheses with Qualitative Research

Qualitative research focuses on thoughts, concepts, or experiences. The data collected often comes in narrative form and concentrates on unearthing insights that can lead to testable hypotheses. Educators use qualitative research in a study’s exploratory stages to uncover patterns or new angles.

Form Strong Conclusions with Quantitative Research

Quantitative research in education and other fields of inquiry is expressed in numbers and measurements. This type of research aims to find data to confirm or test a hypothesis.

Differences in Data Collection Methods

Keeping in mind the main distinction in qualitative vs. quantitative research—gathering descriptive information as opposed to numerical data—it stands to reason that there are different ways to acquire data for each research methodology. While certain approaches do overlap, the way researchers apply these collection techniques depends on their goal.

Interviews, for example, are common in both modes of research. An interview with students that features open-ended questions intended to reveal ideas and beliefs around attendance will provide qualitative data. This data may reveal a problem among students, such as a lack of access to transportation, that schools can help address.

An interview can also include questions posed to receive numerical answers. A case in point: how many days a week do students have trouble getting to school, and of those days, how often is a transportation-related issue the cause? In this example, qualitative and quantitative methodologies can lead to similar conclusions, but the research will differ in intent, design, and form.

Behavioral observation, another method common to both qualitative and quantitative research, shows the same split: a qualitative observation may consider a variety of factors, such as facial expressions, verbal responses, and body language.

On the other hand, a quantitative approach will create a coding scheme for certain predetermined behaviors and observe these in a quantifiable manner.
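
To make the distinction concrete, here is a minimal Python sketch of how coded observations can be turned into counts and percentages. The behavior codes and the observation log are hypothetical; a real study would define its own coding scheme and observation intervals.

```python
# Tally hypothetical behavior codes recorded during a classroom observation.
from collections import Counter

observation_log = [
    "on_task", "on_task", "peer_talk", "off_task", "on_task",
    "peer_talk", "on_task", "off_task", "on_task", "on_task",
]

counts = Counter(observation_log)
total = len(observation_log)

for behavior, count in counts.most_common():
    print(f"{behavior}: {count} intervals ({count / total:.0%})")
```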

Qualitative Research Methods

  • Case Studies: Researchers conduct in-depth investigations into an individual, group, event, or community, typically gathering data through observation and interviews.
  • Focus Groups: A moderator (or researcher) guides conversation around a specific topic among a group of participants.
  • Ethnography: Researchers interact with and observe a specific societal or ethnic group in their real-life environment.
  • Interviews: Researchers ask participants questions to learn about their perspectives on a particular subject.

Quantitative Research Methods

  • Questionnaires and Surveys: Participants answer a list of closed-ended or multiple-choice questions focused on a particular topic.
  • Experiments: Researchers control and test variables to demonstrate cause-and-effect relationships.
  • Observations: Researchers look at quantifiable patterns and behavior.
  • Structured Interviews: Using a predetermined structure, researchers ask participants a fixed set of questions to acquire numerical data.

Choosing a Research Strategy

When choosing which research strategy to employ for a project or study, a number of considerations apply. One key piece of information to help determine whether to use a qualitative vs. quantitative research method is which phase of development the study is in.

For example, if a project is in its early stages and requires more research to find a testable hypothesis, qualitative research methods might prove most helpful. On the other hand, if the research team has already established a hypothesis or theory, quantitative research methods will provide data that can validate the theory or refine it for further testing.

It’s also important to understand a project’s research goals. For instance, do researchers aim to produce findings that reveal how to best encourage student engagement in math? Or is the goal to determine how many students are passing geometry? These two scenarios require distinct sets of data, which will determine the best methodology to employ.

In some situations, studies will benefit from a mixed-methods approach. Using the goals in the above example, one set of data could find the percentage of students passing geometry, which would be quantitative. The research team could also lead a focus group with the students achieving success to discuss which techniques and teaching practices they find most helpful, which would produce qualitative data.
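
As a rough illustration of the quantitative half of that example, the short Python sketch below computes the percentage of students passing geometry. The scores and the passing threshold are hypothetical.

```python
# Compute a pass rate from hypothetical geometry scores.
geometry_scores = [88, 72, 59, 94, 67, 80, 71, 55]
passing_threshold = 70  # hypothetical cutoff

passing = [score for score in geometry_scores if score >= passing_threshold]
pass_rate = len(passing) / len(geometry_scores)
print(f"{pass_rate:.0%} of students are passing geometry")
```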

Learn How to Put Education Research into Action

Those with an interest in learning how to harness research to develop innovative ideas to improve education systems may want to consider pursuing a doctoral degree. American University's School of Education online offers a Doctor of Education (EdD) in Education Policy and Leadership that prepares future educators, school administrators, and other education professionals to become leaders who effect positive changes in schools. Courses such as Applied Research Methods I: Enacting Critical Research provide students with the techniques and research skills needed to begin conducting research that explores new ways to enhance education. Learn more about American University's EdD in Education Policy and Leadership.


Quantitative Research Designs in Educational Research

James H. McMillan, Richard S. Mohn, Micol V. Hammack

LAST REVIEWED: 24 July 2013; LAST MODIFIED: 24 July 2013; DOI: 10.1093/obo/9780199756810-0113

Introduction

The field of education has embraced quantitative research designs since early in the 20th century. The foundation for these designs was based primarily in the psychological literature, and psychology and the social sciences more generally continued to have a strong influence on quantitative designs until the assimilation of qualitative designs in the 1970s and 1980s. More recently, a renewed emphasis on quasi-experimental and nonexperimental quantitative designs to infer causal conclusions has resulted in many newer sources specifically targeting these approaches to the field of education. This bibliography begins with a discussion of general introductions to all quantitative designs in the educational literature. The sources in this section tend to be textbooks or well-known sources written many years ago, though still very relevant and helpful. It should be noted that there are many other sources in the social sciences more generally that contain principles of quantitative designs that are applicable to education. This article then classifies quantitative designs primarily as either nonexperimental or experimental but also emphasizes the use of nonexperimental designs for making causal inferences. Among experimental designs the article distinguishes between those that include random assignment of subjects, those that are quasi-experimental (with no random assignment), and those that are single-case (single-subject) designs. Quasi-experimental and nonexperimental designs used for making causal inferences are becoming more popular in education given the practical difficulties and expense in conducting well-controlled experiments, particularly with the use of structural equation modeling (SEM). There have also been recent developments in statistical analyses that allow stronger causal inferences. Historically, quantitative designs have been tied closely to sampling, measurement, and statistics. In this bibliography there are important sources for newer statistical procedures that are needed for particular designs, especially single-case designs, but relatively little attention to sampling or measurement. The literature on quantitative designs in education is not well focused or comprehensively addressed in very many sources, except in general overview textbooks. Those sources that do include the range of designs are introductory in nature; more advanced designs and statistical analyses tend to be found in journal articles and other individual documents, with a couple exceptions. Another new trend in educational research designs is the use of mixed-method designs (both quantitative and qualitative), though this article does not emphasize these designs.

General Overviews

For many years there have been textbooks that present the range of quantitative research designs, both in education and the social sciences more broadly. Indeed, most quantitative design principles are much the same for education, psychology, and other social sciences. These sources provide an introduction to basic designs that are used within the broader context of other educational research methodologies such as qualitative and mixed-method. Examples of these textbooks written specifically for education include Johnson and Christensen 2012; Mertens 2010; Arthur, et al. 2012; and Creswell 2012. An example of a similar text written for the social sciences, including education, that is dedicated only to quantitative research is Gliner, et al. 2009. In these texts separate chapters are devoted to different types of quantitative designs. For example, Creswell 2012 contains three quantitative design chapters—experimental, which includes both randomized and quasi-experimental designs; correlational (nonexperimental); and survey (also nonexperimental). Johnson and Christensen 2012 also includes three quantitative design chapters, with greater emphasis on quasi-experimental and single-subject research. Mertens 2010 includes a chapter on causal-comparative designs (nonexperimental). Often survey research is addressed as a distinct type of quantitative research with an emphasis on sampling and measurement (how to design surveys). Green, et al. 2006 also presents introductory chapters on different types of quantitative designs, but each of the chapters has different authors. In this book chapters extend basic designs by examining in greater detail nonexperimental methodologies structured for causal inferences and scaled-up experiments. Two additional sources are noted because they represent the types of publications for the social sciences more broadly that discuss many of the same principles of quantitative design among other types of designs. Bickman and Rog 2009 uses different chapter authors to cover topics such as statistical power for designs, sampling, randomized controlled trials, and quasi-experiments, and educational researchers will find this information helpful in designing their studies. Little 2012 provides comprehensive coverage of topics related to quantitative methods in the social, behavioral, and education fields.

Arthur, James, Michael Waring, Robert Coe, and Larry V. Hedges, eds. 2012. Research methods & methodologies in education. Thousand Oaks, CA: SAGE.

Readers will find this book more of a handbook than a textbook. Different individuals author each of the chapters, representing quantitative, qualitative, and mixed-method designs. The quantitative chapters focus on advanced statistical applications, including analysis of variance, regression, and multilevel analysis.

Bickman, Leonard, and Debra J. Rog, eds. 2009. The SAGE handbook of applied social research methods. 2d ed. Thousand Oaks, CA: SAGE.

This handbook includes quantitative design chapters that are written for the social sciences broadly. There are relatively advanced treatments of statistical power, randomized controlled trials, and sampling in quantitative designs, though the coverage of additional topics is not as complete as other sources in this section.

Creswell, John W. 2012. Educational research: Planning, conducting, and evaluating quantitative and qualitative research. 4th ed. Boston: Pearson.

Creswell presents an introduction to all major types of research designs. Three chapters cover quantitative designs—experimental, correlational, and survey research. Both the correlational and survey research chapters focus on nonexperimental designs. Overall the introductions are complete and helpful to those beginning their study of quantitative research designs.

Gliner, Jeffrey A., George A. Morgan, and Nancy L. Leech. 2009. Research methods in applied settings: An integrated approach to design and analysis. 2d ed. New York: Routledge.

This text, unlike others in this section, is devoted solely to quantitative research. As such, all aspects of quantitative designs are covered. There are separate chapters on experimental, nonexperimental, and single-subject designs and on internal validity, sampling, and data-collection techniques for quantitative studies. The content of the book is somewhat more advanced than others listed in this section and is unique in its quantitative focus.

Green, Judith L., Gregory Camilli, and Patricia B. Elmore, eds. 2006. Handbook of complementary methods in education research. Mahwah, NJ: Lawrence Erlbaum.

Green, Camilli, and Elmore edited forty-six chapters that represent many contemporary issues and topics related to quantitative designs. Written by noted researchers, the chapters cover design experiments, quasi-experimentation, randomized experiments, and survey methods. Other chapters include statistical topics that have relevance for quantitative designs.

Johnson, Burke, and Larry B. Christensen. 2012. Educational research: Quantitative, qualitative, and mixed approaches. 4th ed. Thousand Oaks, CA: SAGE.

This comprehensive textbook of educational research methods includes extensive coverage of qualitative and mixed-method designs along with quantitative designs. Three of twenty chapters focus on quantitative designs (experimental, quasi-experimental, and single-case) and nonexperimental, including longitudinal and retrospective, designs. The level of material is relatively high, and there are introductory chapters on sampling and quantitative analyses.

Little, Todd D., ed. 2012. The Oxford handbook of quantitative methods. Vol. 1, Foundations. New York: Oxford Univ. Press.

This handbook is a relatively advanced treatment of quantitative design and statistical analyses. Multiple authors are used to address strengths and weaknesses of many different issues and methods, including advanced statistical tools.

Mertens, Donna M. 2010. Research and evaluation in education and psychology: Integrating diversity with quantitative, qualitative, and mixed methods. 3d ed. Thousand Oaks, CA: SAGE.

This textbook is an introduction to all types of educational designs and includes four chapters devoted to quantitative research—experimental and quasi-experimental, causal comparative and correlational, survey, and single-case research. The author’s treatment of some topics is somewhat more advanced than texts such as Creswell 2012, with extensive attention to threats to internal validity for some of the designs.


Handbook of Quantitative Methods for Educational Research

  • © 2013
  • Timothy Teo

University of Auckland, New Zealand


  • This handbook has something for both beginning and advanced educational researchers.


Table of contents (17 chapters)

Measurement Theory

Psychometrics

  • Mark Wilson, Perman Gochyyev

Classical Test Theory

  • Ze Wang, Steven J. Osterlind

Item Response Theory

  • Xitao Fan, Shaojing Sun

Methods of Analysis

Multiple Regression

  • Ken Kelley, Jocelyn H. H. Bolin

Cluster Analysis

  • Christine DiStefano, Diana Mindrila

Multivariate Analysis of Variance

  • Lisa L. Harlow, Sunny R. Duerr

Logistic Regression

  • Brian F. French, Jason C. Immekus, Hsiao-Ju Yen

Exploratory Factor Analysis

  • W. Holmes Finch

A Brief Introduction to Hierarchical Linear Modeling

  • Jason W. Osborne, Shevaun D. Neupert

Longitudinal Data Analysis

  • D. Betsy McCoach, John P. Madura, Karen E. Rambo-Hernandez, Ann A. O’Connell, Megan E. Welsh

Meta-Analysis

  • Spyros Konstantopoulos

Agent Based Modelling

  • Mauricio Salgado, Nigel Gilbert

Mediation, Moderation & Interaction

  • James Hall, Pamela Sammons

Structural Equation Models

Introduction to Confirmatory Factor Analysis and Structural Equation Modeling

  • Matthew W. Gallagher, Timothy A. Brown

Testing Measurement and Structural Invariance

  • Daniel A. Sass, Thomas A. Schmitt

Mixture Models in Education

  • George A. Marcoulides, Ronald H. Heck

Editors and Affiliations

Timothy Teo

Bibliographic Information

Book Title: Handbook of Quantitative Methods for Educational Research

Editors: Timothy Teo

DOI: https://doi.org/10.1007/978-94-6209-404-8

Publisher: SensePublishers Rotterdam

eBook Packages: Humanities, Social Sciences and Law, Education (R0)

Copyright Information: SensePublishers 2013

eBook ISBN: 978-94-6209-404-8; Published: 07 February 2014

Edition Number: 1

Number of Pages: VIII, 404

Topics: Education, general


College of Education and Human Development

Department of Educational Psychology

Quantitative methods in education

Solve problems in education through research.

Students in Quantitative Methods in Education engage in the science and practice of educational measurement and statistics, primarily through the development and application of statistical and psychometric methods. All QME students will engage in coursework addressing fundamental topics related to statistics, educational measurement, research methods, and foundations in education (e.g., learning and cognition, social development). Students will also undertake additional coursework and complete a set of milestones that will specialize their knowledge and scholarship in educational measurement or statistics. Upon completing the program, graduates will be equipped to help inform educational policy, practice, and curriculum and—most importantly—help schools and students succeed.

  • Test publishing firms
  • Teaching and research at colleges and universities (PhD only)
  • Research and evaluation centers
  • Public school systems
  • State departments of instruction
  • Private industry

Quote from V.N. Vimal Rao, PhD '23

The strong theoretical and methodological foundation I developed in QME and EPSY supports my research and my mentoring of student researchers, while the teaching experience and knowledge of educational psychology I gained supports my teaching and mentoring of teaching assistants. V.N. Vimal Rao, PhD '23 Teaching Assistant Professor in the Department of Statistics University of Illinois Urbana Champaign

Submit your application for the fall semester following the deadlines below. Note the dates are the same for both MA and PhD applicants.

To be considered for fellowships and departmental financial assistance, application materials must be submitted to the program and the Graduate School by the December 1 deadline. (If you are requesting a waiver for the application fee, the last day to apply is 11/17/24.)

If you're not seeking a fellowship or departmental financial aid, you have until March 1 to submit your application materials. (If you are requesting a waiver for the application fee, the last day to apply is 2/15/25.)

MA curriculum (33 credits)

PhD curriculum (72 credits)

The QME program strives to provide funding opportunities to all incoming students. While we can’t typically guarantee funding, over the last five years, we have been able to fund over 95% of our students who were looking for funding (including our MA students)!

Visit the College of Education and Human Development's Finance and Funding page for information on tuition.

Fellowships and awards

Submit your application materials by the December 1 deadline, and you’ll automatically be considered for Graduate School fellowships and departmental awards based on scholastic achievement. Notification of awards will be sent in March.

Note: Spring, summer, and fall (March deadline) applicants will not qualify for fellowships.

Graduate assistantships

Get paid to work as a teaching assistant, graduate instructor or research assistant. Graduate assistantships are available through the department, College of Education and Human Development, and the University.

  • John P. Yackel/Pearson Graduate Internship
  • Jack Merwin Graduate Assistantship
  • All University of Minnesota graduate assistantships

Note: Applicants who complete their applications by the March 1 deadline will be less likely to receive graduate assistantships than students who meet the December 1 deadline.

Additional funding

Visit the College of Education and Human Development's Finance and Funding page for more information on funding.

Financial aid

Visit OneStop Student Services for more information on available financial aid.

The Department of Educational Psychology offers a minor in educational psychology with an emphasis in quantitative methods in education.

Quote from Rik Lamm, PhD '23

My background in the QME program has equipped me with the skills necessary for my current role as a Research, Evaluation, and Assessment Scientist for Bloomington Public Schools. These include developing non-cognitive surveys such as student climate surveys and parent engagement surveys, as well as analyzing data from academic assessments such as the MCAs. Additionally, QME has equipped me with the skills to interpret complex data in order to predict longitudinal trends. This ability leads to the development of research-driven strategies that benefit both students and teachers. Rik Lamm, PhD '23 Research, Evaluation, and Assessment Scientist Bloomington Public Schools

Faculty and staff


Nidhi Kohli

Royal and Virginia Anderson Professor of Quantitative Methods in Education; Program Coordinator

Chelsey Legacy

Teaching assistant professor

Suzanne Loch

Senior teaching specialist

Michael Rodriguez

CEHD Dean; Campbell Leadership Chair in Education and Human Development; co-founding director of Educational Equity Resource Center

Andrew Zieffler

Teaching professor

Adjunct faculty and program affiliates

Adjunct faculty

Claudio Violato

Assistant dean, Medical School

Program affiliates

Adam Rothman

Associate professor, School of Statistics  

Quote from José Palma, PhD '21

It is the combination of psychometric research and applied focus, in addition to knowledge gained from my academic journey, that makes me a competitive and atypical educational measurement researcher today. José Palma, PhD '21 ACES Faculty Fellow and Assistant Professor Texas A&M University

Corissa Rohloff, PhD student in Ed Psych, awarded Russell W. Burris Memorial Fellowship

Corissa Rohloff, PhD candidate in the quantitative methods in education program, has been awarded the Russell W. Burris Memorial Fellowship.

QME recognizes students in year end celebration

Students in the Department of Educational Psychology’s quantitative methods in education (QME) program were recognized for their contributions to the program during the 2022-23 academic year.

Kohli speaks at two international conferences

Dr. Nidhi Kohli, Royal and Virginia Anderson Professor of Quantitative Methods in Education (QME) and Program Coordinator for the QME program in the Department of Educational Psychology, was invited to present at two conferences this summer: the International Meeting of Psychometric Society and the International Indian Statistical Association.

Introduction to Educational Research

Craig A. Mertler - Barry University, USA

This Third Edition of Craig Mertler’s practical text helps readers every step of the way as they plan and execute their first educational research project. Offering balanced coverage of qualitative and quantitative methods, an emphasis on ethics, and a wealth of new examples and concrete applications, the new edition continues to use conversational, nontechnical language to help students clearly understand and apply research concepts, principles, procedures, and terminology. Expanded coverage of foundations of research, an increased focus on integrating qualitative and quantitative research, and updated coverage of research questions and the tools of qualitative research bring the book thoroughly up-to-date, while streamlined coverage of statistics shows students how to do quantitative analysis in a straightforward way.


Not what the dept. is looking for

It was easy to read and well organised. I felt this would help Year 3 undergraduates with their dissertation.

I've used Mertler (author) previously in this introductory research graduate course. The table of contents is logically organized. The content is geared towards an introduction course to research process and options. Access to additional resources (test bank, slides, and tables & figures) helps to make the D2L online platform easier. Objectives are outlined at the beginning of each chapter.

A comprehensive textbook that describes everything about research methodology in education.

Accessible writing for inservice teachers.

  • More in-depth coverage of the scientific method, research questions, and literature reviews gives students a solid grounding in the research process.
  • New and expanded coverage of stating and refining research questions, and of the features and qualities of good research questions, prepares readers to successfully plan and execute an educational research project.
  • Updated and expanded coverage of the history of research ethics involving human research participants and tips for completing IRB applications further emphasize the importance of ethics in all research projects. Incorporates the latest changes in the 7th edition of the APA Manual, including coverage of research report writing, bias-free writing, self-plagiarism, and citation style.
  • New research articles appear in appendices to demonstrate qualitative, quantitative, and mixed methods and include annotations that highlight different parts of the research process.
  • Strong practical coverage of action research includes new discussions of problems of practice in action research and using action research as a means to pursue organizational change.
  • Student Learning Objectives (SLOs) list the four to eight major targeted learning objectives for that chapter and are ideal for review.
  • Chapter-ending Developmental Activities (five per chapter) give students opportunities to apply concepts and skills and can be used as course assignments, in-class activities, and discussions.
  • Chapter Summaries listed in bullet-point format provide focused reviews of chapter content.
  • Annotated published Research Reports in the appendices contain complete published survey research, qualitative research, and quantitative research articles, which allow students to engage in critiques of three published articles, as well as to see different formats and writing styles appropriate for academic journals.
  • A glossary of more than 350 terms —highlighted in boldface when first introduced—offers access to one of the most comprehensive glossaries presented in any educational research textbook.

Sample Materials & Chapters

Chapter 1 - What Is Educational Research

Chapter 2 - Overview of the Educational Research Process


PhD in Educational Psychology: Quantitative Methods

Educational research has a strong tradition of employing state-of-the-art statistical and psychometric (psychological measurement) techniques. Researchers in all areas of education develop measuring instruments, design and conduct experiments and surveys, and analyze data resulting from these activities. Because of this tradition, quantitative methods has long been an area of specialization within educational psychology. Graduates in this area teach, serve as consultants to educational researchers, and conduct research on statistics and psychometrics in education-related fields. Within the program, the quantitative methods area offers the two major specializations of statistics and measurement.

The study of quantitative methods takes advantage of the range of resources at the University of Wisconsin–Madison and includes coursework in statistics, mathematics, and computer sciences, and in other units of the School of Education.


Quantitative Research Methods in Education Program


Applications to this program are accepted every other year. We will be accepting applications in January 2025 for the cohort entering Fall 2025 and in January 2027 for the cohort entering Fall 2027. Please see our  apply  page for instructions and exact deadlines.


Why earn a graduate degree in quantitative research methods in education?

Our PhD program in quantitative research methods in education (QRME) develops researchers, scholars, and policy leaders who engage in traditions of inquiry that create knowledge and understanding founded in empirical evidence.

The program focuses on:

  • Building a strong understanding of quantitative methods in research
  • Studying causal and complex relationships within applied educational, social, and institutional settings
  • Using research to inform policy and practice in applied educational and social settings

Ideal QRME PhD students:

  • Understand the importance of evidence and data in making decisions
  • Take an interdisciplinary and systems-level perspective to problems in education
  • Value the union of rigorous research methodology with strong grounding in theory and practice

Program Options

The quantitative research methods in education program offers the following degree options:

  • Doctor of Philosophy (PhD)

What can I do with a degree in quantitative research methods in education?

Quantitative research methods in education PhD graduates pursue careers in:

  • Academic institutions
  • Research and evaluation centers
  • Policy institutes
  • Applied settings including state departments of education and school districts

Information Session

Want to learn more? Please view our virtual information session with former Program Director, Gina Biancarosa, and program faculty member, David Liebowitz. You can learn about program structure, design, and classes, and find answers to frequently asked questions.

Enroll in the NEW Specialization in Educational Data Science for Social Systems Leaders

Become a “big data” leader who has the skills to use data to guide decision making. Data-informed decision-making is now the minimum standard of practice in many fields, including education and social systems.

Specialization in Educational Data Science

A five-course Data Science Specialization is available now in the College of Education at the University of Oregon. Contact Cengiz Zopluoglu [email protected] for information about how the Data Science Specialization can become an emphasis area of a doctoral or master’s degree.


Social Systems Data Science

Our Social Systems Data Science (SDS) Network is dedicated to accelerating data-driven research and outreach efforts in the educational, behavioral, and social sciences.


Contact Information

Keith Zvoch, PhD Professor [email protected]

Kim Boyd Graduate Coordinator [email protected] 541-346-5968

Quantitative research in education: Journals

  • Computers and education "Computers & Education aims to increase knowledge and understanding of ways in which digital technology can enhance education, through the publication of high-quality research, which extends theory and practice."
  • Journal of educational and behavioral statistics "Cosponsored by the American Statistical Association, the Journal of Educational and Behavioral Statistics (JEBS) publishes articles that are original and useful to those applying statistical approaches to problems and issues in educational or behavioral research. Typical papers present new methods of analysis."
  • Research in higher education "Research in Higher Education publishes studies that examine issues pertaining to postsecondary education. The journal is open to studies using a wide range of methods, but has particular interest in studies that apply advanced quantitative research methods to issues in postsecondary education or address postsecondary education policy issues."

Quantitative Research – Methods, Types and Analysis

What is Quantitative Research

Quantitative research is a type of research that collects and analyzes numerical data to test hypotheses and answer research questions. This research typically involves a large sample size and uses statistical analysis to make inferences about a population based on the data collected. It often involves the use of surveys, experiments, or other structured data collection methods to gather quantitative data.

Quantitative Research Methods

Quantitative Research Methods are as follows:

Descriptive Research Design

Descriptive research design is used to describe the characteristics of a population or phenomenon being studied. This research method is used to answer the questions of what, where, when, and how. Descriptive research designs use a variety of methods such as observation, case studies, and surveys to collect data. The data is then analyzed using statistical tools to identify patterns and relationships.
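
As a brief illustration, the following Python sketch summarizes a small, hypothetical dataset with basic descriptive statistics; the variables and values are invented for the example.

```python
# Describe a hypothetical dataset with count, mean, std, min, quartiles, and max.
import pandas as pd

data = pd.DataFrame({
    "age": [14, 15, 14, 16, 15, 15, 14, 16],
    "daily_reading_minutes": [20, 35, 15, 40, 30, 25, 10, 45],
})

print(data.describe())
```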

Correlational Research Design

Correlational research design is used to investigate the relationship between two or more variables. Researchers use correlational research to determine whether a relationship exists between variables and to what extent they are related. This research method involves collecting data from a sample and analyzing it using statistical tools such as correlation coefficients.
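
For example, a correlational analysis might compute a Pearson correlation coefficient between two variables. The sketch below uses SciPy with hypothetical study-hours and test-score data.

```python
# Estimate the correlation between hypothetical study hours and test scores.
from scipy import stats

study_hours = [2, 4, 3, 6, 5, 8, 7, 1]
test_scores = [65, 72, 70, 85, 80, 92, 88, 60]

r, p_value = stats.pearsonr(study_hours, test_scores)
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
```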

Quasi-experimental Research Design

Quasi-experimental research design is used to investigate cause-and-effect relationships between variables. This research method is similar to experimental research design, but it lacks full control over the independent variable. Researchers use quasi-experimental research designs when it is not feasible or ethical to manipulate the independent variable.

Experimental Research Design

Experimental research design is used to investigate cause-and-effect relationships between variables. This research method involves manipulating the independent variable and observing the effects on the dependent variable. Researchers use experimental research designs to test hypotheses and establish cause-and-effect relationships.
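
A common analysis in a simple two-group experiment is an independent-samples t-test comparing the treatment and control groups on the dependent variable. The sketch below uses hypothetical post-test scores.

```python
# Compare hypothetical post-test scores for treatment vs. control groups.
from scipy import stats

treatment_scores = [78, 85, 90, 82, 88, 79, 91, 84]  # e.g., new teaching method
control_scores = [72, 75, 80, 70, 77, 74, 79, 73]    # e.g., business as usual

t_stat, p_value = stats.ttest_ind(treatment_scores, control_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```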

Survey Research

Survey research involves collecting data from a sample of individuals using a standardized questionnaire. This research method is used to gather information on attitudes, beliefs, and behaviors of individuals. Researchers use survey research to collect data quickly and efficiently from a large sample size. Survey research can be conducted through various methods such as online, phone, mail, or in-person interviews.
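
Closed-ended survey responses are often summarized as frequencies and percentages. The sketch below tabulates hypothetical Likert-style responses with pandas.

```python
# Tabulate hypothetical Likert-style survey responses.
import pandas as pd

responses = pd.Series([
    "Agree", "Strongly agree", "Neutral", "Agree", "Disagree",
    "Agree", "Strongly agree", "Neutral", "Agree", "Agree",
])

counts = responses.value_counts()
percents = (responses.value_counts(normalize=True) * 100).round(1)
print(pd.DataFrame({"count": counts, "percent": percents}))
```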

Quantitative Research Analysis Methods

Here are some commonly used quantitative research analysis methods:

Statistical Analysis

Statistical analysis is the most common quantitative research analysis method. It involves using statistical tools and techniques to analyze the numerical data collected during the research process. Statistical analysis can be used to identify patterns, trends, and relationships between variables, and to test hypotheses and theories.

Regression Analysis

Regression analysis is a statistical technique used to analyze the relationship between one dependent variable and one or more independent variables. Researchers use regression analysis to identify and quantify the impact of independent variables on the dependent variable.
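
As an illustration, the sketch below fits an ordinary least squares regression with one independent variable using statsmodels; the attendance and exam-score data are hypothetical.

```python
# Regress hypothetical exam scores on attendance rates.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "attendance_rate": [0.95, 0.80, 0.88, 0.70, 0.92, 0.65, 0.85, 0.98],
    "exam_score": [90, 74, 81, 62, 88, 58, 79, 94],
})

model = smf.ols("exam_score ~ attendance_rate", data=df).fit()
print(model.summary())  # the slope estimates the change in score per unit of attendance
```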

Factor Analysis

Factor analysis is a statistical technique used to identify underlying factors that explain the correlations among a set of variables. Researchers use factor analysis to reduce a large number of variables to a smaller set of factors that capture the most important information.
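
A minimal sketch of an exploratory factor analysis with scikit-learn is shown below; the survey-item scores are random, hypothetical data used only to demonstrate the call.

```python
# Extract two latent factors from six hypothetical survey items.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
item_scores = rng.normal(size=(100, 6))  # 100 respondents, 6 items

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(item_scores)
print(fa.components_)  # loadings of each item on the two factors
```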

Structural Equation Modeling

Structural equation modeling is a statistical technique used to test complex relationships between variables. It involves specifying a model that includes both observed and unobserved variables, and then using statistical methods to test the fit of the model to the data.

Time Series Analysis

Time series analysis is a statistical technique used to analyze data that is collected over time. It involves identifying patterns and trends in the data, as well as any seasonal or cyclical variations.
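
For instance, a monthly series can be decomposed into trend, seasonal, and residual components. The sketch below uses statsmodels with hypothetical monthly enrollment figures.

```python
# Decompose a hypothetical monthly enrollment series into trend and seasonality.
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

index = pd.date_range("2020-01-01", periods=36, freq="MS")
enrollment = pd.Series(
    [100 + 2 * i + 10 * ((i % 12) in (7, 8)) for i in range(36)], index=index
)

result = seasonal_decompose(enrollment, model="additive", period=12)
print(result.trend.dropna().head())
```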

Multilevel Modeling

Multilevel modeling is a statistical technique used to analyze data that is nested within multiple levels. For example, researchers might use multilevel modeling to analyze data that is collected from individuals who are nested within groups, such as students nested within schools.
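
The students-within-schools example can be sketched as a linear mixed-effects model with a random intercept for each school, as in the statsmodels example below; the data are hypothetical.

```python
# Fit a two-level model: hypothetical students nested within schools.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "school": ["A", "A", "A", "B", "B", "B", "C", "C", "C", "C"],
    "hours_studied": [2, 4, 3, 5, 6, 4, 1, 2, 3, 5],
    "score": [70, 78, 75, 85, 90, 83, 60, 66, 68, 77],
})

# Random intercept per school; fixed effect of hours studied
model = smf.mixedlm("score ~ hours_studied", df, groups=df["school"])
result = model.fit()
print(result.summary())
```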

Applications of Quantitative Research

Quantitative research has many applications across a wide range of fields. Here are some common examples:

  • Market Research : Quantitative research is used extensively in market research to understand consumer behavior, preferences, and trends. Researchers use surveys, experiments, and other quantitative methods to collect data that can inform marketing strategies, product development, and pricing decisions.
  • Health Research: Quantitative research is used in health research to study the effectiveness of medical treatments, identify risk factors for diseases, and track health outcomes over time. Researchers use statistical methods to analyze data from clinical trials, surveys, and other sources to inform medical practice and policy.
  • Social Science Research: Quantitative research is used in social science research to study human behavior, attitudes, and social structures. Researchers use surveys, experiments, and other quantitative methods to collect data that can inform social policies, educational programs, and community interventions.
  • Education Research: Quantitative research is used in education research to study the effectiveness of teaching methods, assess student learning outcomes, and identify factors that influence student success. Researchers use experimental and quasi-experimental designs, as well as surveys and other quantitative methods, to collect and analyze data.
  • Environmental Research: Quantitative research is used in environmental research to study the impact of human activities on the environment, assess the effectiveness of conservation strategies, and identify ways to reduce environmental risks. Researchers use statistical methods to analyze data from field studies, experiments, and other sources.

Characteristics of Quantitative Research

Here are some key characteristics of quantitative research:

  • Numerical data: Quantitative research involves collecting numerical data through standardized methods such as surveys, experiments, and observational studies. This data is analyzed using statistical methods to identify patterns and relationships.
  • Large sample size: Quantitative research often involves collecting data from a large sample of individuals or groups in order to increase the reliability and generalizability of the findings.
  • Objective approach: Quantitative research aims to be objective and impartial in its approach, focusing on the collection and analysis of data rather than personal beliefs, opinions, or experiences.
  • Control over variables: Quantitative research often involves manipulating variables to test hypotheses and establish cause-and-effect relationships. Researchers aim to control for extraneous variables that may impact the results.
  • Replicable: Quantitative research aims to be replicable, meaning that other researchers should be able to conduct similar studies and obtain similar results using the same methods.
  • Statistical analysis: Quantitative research involves using statistical tools and techniques to analyze the numerical data collected during the research process. Statistical analysis allows researchers to identify patterns, trends, and relationships between variables, and to test hypotheses and theories.
  • Generalizability: Quantitative research aims to produce findings that can be generalized to larger populations beyond the specific sample studied. This is achieved through the use of random sampling methods and statistical inference.

Examples of Quantitative Research

Here are some examples of quantitative research in different fields:

  • Market Research: A company conducts a survey of 1000 consumers to determine their brand awareness and preferences. The data is analyzed using statistical methods to identify trends and patterns that can inform marketing strategies.
  • Health Research: A researcher conducts a randomized controlled trial to test the effectiveness of a new drug for treating a particular medical condition. The study involves collecting data from a large sample of patients and analyzing the results using statistical methods.
  • Social Science Research: A sociologist conducts a survey of 500 people to study attitudes toward immigration in a particular country. The data is analyzed using statistical methods to identify factors that influence these attitudes.
  • Education Research: A researcher conducts an experiment to compare the effectiveness of two different teaching methods for improving student learning outcomes. The study involves randomly assigning students to different groups and collecting data on their performance on standardized tests.
  • Environmental Research: A team of researchers conducts a study to investigate the impact of climate change on the distribution and abundance of a particular species of plant or animal. The study involves collecting data on environmental factors and population sizes over time and analyzing the results using statistical methods.
  • Psychology: A researcher conducts a survey of 500 college students to investigate the relationship between social media use and mental health. The data is analyzed using statistical methods to identify correlations and potential causal relationships.
  • Political Science: A team of researchers conducts a study to investigate voter behavior during an election. They use survey methods to collect data on voting patterns, demographics, and political attitudes, and analyze the results using statistical methods.

How to Conduct Quantitative Research

Here is a general overview of how to conduct quantitative research:

  • Develop a research question: The first step in conducting quantitative research is to develop a clear and specific research question. This question should be based on a gap in existing knowledge, and should be answerable using quantitative methods.
  • Develop a research design: Once you have a research question, you will need to develop a research design. This involves deciding on the appropriate methods to collect data, such as surveys, experiments, or observational studies. You will also need to determine the appropriate sample size, data collection instruments, and data analysis techniques.
  • Collect data: The next step is to collect data. This may involve administering surveys or questionnaires, conducting experiments, or gathering data from existing sources. It is important to use standardized methods to ensure that the data is reliable and valid.
  • Analyze data: Once the data have been collected, it is time to analyze them. This involves using statistical methods to identify patterns, trends, and relationships between variables. Common statistical techniques include correlation analysis, regression analysis, and hypothesis testing (a minimal hypothesis-testing sketch follows this list).
  • Interpret results: After analyzing the data, you will need to interpret the results. This involves identifying the key findings, determining their significance, and drawing conclusions based on the data.
  • Communicate findings: Finally, you will need to communicate your findings. This may involve writing a research report, presenting at a conference, or publishing in a peer-reviewed journal. It is important to clearly communicate the research question, methods, results, and conclusions to ensure that others can understand and replicate your research.
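As a concrete, hypothetical illustration of the "analyze data" step above, the snippet below runs an independent-samples t-test (Welch's variant) on invented exam scores from two groups taught with different methods; the scores are made up for the example.

```python
import numpy as np
from scipy import stats

# Hypothetical exam scores for two groups of students taught with different methods
method_a = np.array([72, 75, 78, 70, 74, 77, 73, 76])
method_b = np.array([68, 71, 69, 74, 70, 72, 67, 73])

# Independent-samples t-test (Welch's version, which does not assume equal variances)
t_stat, p_value = stats.ttest_ind(method_a, method_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```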

When to use Quantitative Research

Here are some situations when quantitative research can be appropriate:

  • To test a hypothesis: Quantitative research is often used to test a hypothesis or a theory. It involves collecting numerical data and using statistical analysis to determine if the data supports or refutes the hypothesis.
  • To generalize findings: If you want to generalize the findings of your study to a larger population, quantitative research can be useful. This is because it allows you to collect numerical data from a representative sample of the population and use statistical analysis to make inferences about the population as a whole.
  • To measure relationships between variables: If you want to measure the relationship between two or more variables, such as the relationship between age and income, or between education level and job satisfaction, quantitative research can be useful. It allows you to collect numerical data on both variables and use statistical analysis to determine the strength and direction of the relationship.
  • To identify patterns or trends: Quantitative research can be useful for identifying patterns or trends in data. For example, you can use quantitative research to identify trends in consumer behavior or to identify patterns in stock market data.
  • To quantify attitudes or opinions: If you want to measure attitudes or opinions on a particular topic, quantitative research can be useful. It allows you to collect numerical data using surveys or questionnaires and analyze the data using statistical methods to determine the prevalence of certain attitudes or opinions.

Purpose of Quantitative Research

The purpose of quantitative research is to systematically investigate and measure the relationships between variables or phenomena using numerical data and statistical analysis. The main objectives of quantitative research include:

  • Description: To provide a detailed and accurate description of a particular phenomenon or population.
  • Explanation: To explain the reasons for the occurrence of a particular phenomenon, such as identifying the factors that influence a behavior or attitude.
  • Prediction: To predict future trends or behaviors based on past patterns and relationships between variables.
  • Control: To identify the best strategies for controlling or influencing a particular outcome or behavior.

Quantitative research is used in many different fields, including social sciences, business, engineering, and health sciences. It can be used to investigate a wide range of phenomena, from human behavior and attitudes to physical and biological processes. The purpose of quantitative research is to provide reliable and valid data that can be used to inform decision-making and improve understanding of the world around us.

Advantages of Quantitative Research

There are several advantages of quantitative research, including:

  • Objectivity: Quantitative research is based on objective data and statistical analysis, which reduces the potential for bias or subjectivity in the research process.
  • Reproducibility: Because quantitative research involves standardized methods and measurements, it is more likely to be reproducible and reliable.
  • Generalizability: Quantitative research allows for generalizations to be made about a population based on a representative sample, which can inform decision-making and policy development.
  • Precision: Quantitative research allows for precise measurement and analysis of data, which can provide a more accurate understanding of phenomena and relationships between variables.
  • Efficiency: Quantitative research can be conducted relatively quickly and efficiently, especially when compared to qualitative research, which may involve lengthy data collection and analysis.
  • Large sample sizes: Quantitative research can accommodate large sample sizes, which can increase the representativeness and generalizability of the results.

Limitations of Quantitative Research

There are several limitations of quantitative research, including:

  • Limited understanding of context: Quantitative research typically focuses on numerical data and statistical analysis, which may not provide a comprehensive understanding of the context or underlying factors that influence a phenomenon.
  • Simplification of complex phenomena: Quantitative research often involves simplifying complex phenomena into measurable variables, which may not capture the full complexity of the phenomenon being studied.
  • Potential for researcher bias: Although quantitative research aims to be objective, there is still the potential for researcher bias in areas such as sampling, data collection, and data analysis.
  • Limited ability to explore new ideas: Quantitative research is often based on pre-determined research questions and hypotheses, which may limit the ability to explore new ideas or unexpected findings.
  • Limited ability to capture subjective experiences: Quantitative research is typically focused on objective data and may not capture the subjective experiences of individuals or groups being studied.
  • Ethical concerns: Quantitative research may raise ethical concerns, such as invasion of privacy or the potential for harm to participants.




Quantitative Research Methods in Education (PhD)


Our PhD program in quantitative research methods in education (QRME) develops researchers, scholars, and policy leaders who engage in traditions of inquiry that create knowledge and understanding founded in empirical evidence.

The program focuses on:

  • Building a strong understanding of quantitative methods in research
  • Studying causal and complex relationships within applied educational, social, and institutional settings
  • Using research to inform policy and practice in applied educational and social settings

Ideal QRME PhD students:

  • Understand the importance of evidence and data in making decisions
  • Take an interdisciplinary and systems-level perspective to problems in education
  • Value the union of rigorous research methodology with strong grounding in theory and practice

For more information about the PhD in QRME program, please visit the program's website: Quantitative Research Methods in Education Program | College of Education (uoregon.edu)

For application instructions, please see: Apply to the Quantitative Research Methods in Education Program | College of Education (uoregon.edu)

Program Learning Outcomes

Upon successful completion of this program, students will be able to:

  • Meet professional standards by being able to read and critically consume and analyze research; translate research into practices, decisions, and policy; and understand and employ professional standards for equity, fairness, ethical treatment of human subjects, and principled leadership in research and practice.
  • Engage in scholarly communications by demonstrating effective oral and written communication skills when working with diverse partners and communicating research findings; and by demonstrating knowledge and understanding of diversity issues, organizational identity, and strategic communication.
  • Engage in educational inquiry by gathering and applying empirical evidence in practice; understanding how to conduct and supervise field-based research; and employing conceptual frameworks and methodological approaches appropriate for the line of inquiry.


Our research aims to measure the impact of educational interventions and programs, identify the factors that contribute to student success, develop new educational policies and practices, and evaluate the effectiveness of educational systems.



Quantitative Research Methods in Medical Education

Submitted for publication January 8, 2018. Accepted for publication November 29, 2018.

John T. Ratelle , Adam P. Sawatsky , Thomas J. Beckman; Quantitative Research Methods in Medical Education. Anesthesiology 2019; 131:23–35 doi: https://doi.org/10.1097/ALN.0000000000002727


There has been a dramatic growth of scholarly articles in medical education in recent years. Evaluating medical education research requires specific orientation to issues related to format and content. Our goal is to review the quantitative aspects of research in medical education so that clinicians may understand these articles with respect to framing the study, recognizing methodologic issues, and utilizing instruments for evaluating the quality of medical education research. This review can be used both as a tool when appraising medical education research articles and as a primer for clinicians interested in pursuing scholarship in medical education.


There has been an explosion of research in the field of medical education. A search of PubMed demonstrates that more than 40,000 articles have been indexed under the medical subject heading “Medical Education” since 2010, which is more than the total number of articles indexed under this heading in the 1980s and 1990s combined. Keeping up to date requires that practicing clinicians have the skills to interpret and appraise the quality of research articles, especially when serving as editors, reviewers, and consumers of the literature.

While medical education shares many characteristics with other biomedical fields, substantial particularities exist. We recognize that practicing clinicians may not be familiar with the nuances of education research and how to assess its quality. Therefore, our purpose is to provide a review of quantitative research methodologies in medical education. Specifically, we describe a structure that can be used when conducting or evaluating medical education research articles.

Clarifying the research purpose is an essential first step when reading or conducting scholarship in medical education. 1   Medical education research can serve a variety of purposes, from advancing the science of learning to improving the outcomes of medical trainees and the patients they care for. However, a well-designed study has limited value if it addresses vague, redundant, or unimportant medical education research questions.

When appraising a medical education research article, the following questions provide a useful orientation:

  • What is the research topic and why is it important? What is unknown about the research topic? Why is further research necessary?
  • What is the conceptual framework being used to approach the study?
  • What is the statement of study intent?
  • What are the research methodology and study design? Are they appropriate for the study objective(s)?
  • Which threats to internal validity are most relevant for the study?
  • What is the outcome and how was it measured?
  • Can the results be trusted? What is the validity and reliability of the measurements?
  • How were research subjects selected? Is the research sample representative of the target population?
  • Was the data analysis appropriate for the study design and type of data?
  • What is the effect size? Do the results have educational significance?

Fortunately, there are steps to ensure that the purpose of a research study is clear and logical. Table 1   2–5   outlines these steps, which will be described in detail in the following sections. We describe these elements not as a simple “checklist,” but as an advanced organizer that can be used to understand a medical education research study. These steps can also be used by clinician educators who are new to the field of education research and who wish to conduct scholarship in medical education.

Table 1. Steps in Clarifying the Purpose of a Research Study in Medical Education

Literature Review and Problem Statement

A literature review is the first step in clarifying the purpose of a medical education research article. 2 , 5 , 6   When conducting scholarship in medical education, a literature review helps researchers develop an understanding of their topic of interest. This understanding includes both existing knowledge about the topic as well as key gaps in the literature, which aids the researcher in refining their study question. Additionally, a literature review helps researchers identify conceptual frameworks that have been used to approach the research topic. 2  

When reading scholarship in medical education, a successful literature review provides background information so that even someone unfamiliar with the research topic can understand the rationale for the study. Located in the introduction of the manuscript, the literature review guides the reader through what is already known in a manner that highlights the importance of the research topic. The literature review should also identify key gaps in the literature so the reader can understand the need for further research. This gap description includes an explicit problem statement that summarizes the important issues and provides a reason for the study. 2 , 4   The following is one example of a problem statement:

“Identifying gaps in the competency of anesthesia residents in time for intervention is critical to patient safety and an effective learning system… [However], few available instruments relate to complex behavioral performance or provide descriptors…that could inform subsequent feedback, individualized teaching, remediation, and curriculum revision.” 7  

This problem statement articulates the research topic (identifying resident performance gaps), why it is important (to intervene for the sake of learning and patient safety), and current gaps in the literature (few tools are available to assess resident performance). The researchers have now underscored why further research is needed and have helped readers anticipate the overarching goals of their study (to develop an instrument to measure anesthesiology resident performance). 4  

The Conceptual Framework

Following the literature review and articulation of the problem statement, the next step in clarifying the research purpose is to select a conceptual framework that can be applied to the research topic. Conceptual frameworks are “ways of thinking about a problem or a study, or ways of representing how complex things work.” 3   Just as clinical trials are informed by basic science research in the laboratory, conceptual frameworks often serve as the “basic science” that informs scholarship in medical education. At a fundamental level, conceptual frameworks provide a structured approach to solving the problem identified in the problem statement.

Conceptual frameworks may take the form of theories, principles, or models that help to explain the research problem by identifying its essential variables or elements. Alternatively, conceptual frameworks may represent evidence-based best practices that researchers can apply to an issue identified in the problem statement. 3   Importantly, there is no single best conceptual framework for a particular research topic, although the choice of a conceptual framework is often informed by the literature review and knowing which conceptual frameworks have been used in similar research. 8   For further information on selecting a conceptual framework for research in medical education, we direct readers to the work of Bordage 3   and Irby et al. 9  

To illustrate how different conceptual frameworks can be applied to a research problem, suppose you encounter a study to reduce the frequency of communication errors among anesthesiology residents during day-to-night handoff. Table 2 10 , 11   identifies two different conceptual frameworks researchers might use to approach the task. The first framework, cognitive load theory, has been proposed as a conceptual framework to identify potential variables that may lead to handoff errors. 12   Specifically, cognitive load theory identifies the three factors that affect short-term memory and thus may lead to communication errors:

Table 2. Conceptual Frameworks to Address the Issue of Handoff Errors in the Intensive Care Unit

  • Intrinsic load: Inherent complexity or difficulty of the information the resident is trying to learn (e.g., complex patients).
  • Extraneous load: Distractions or demands on short-term memory that are not related to the information the resident is trying to learn (e.g., background noise, interruptions).
  • Germane load: Effort or mental strategies used by the resident to organize and understand the information he or she is trying to learn (e.g., teach back, note taking).

Using cognitive load theory as a conceptual framework, researchers may design an intervention to reduce extraneous load and help the resident remember the overnight to-do’s. An example might be dedicated, pager-free handoff times where distractions are minimized.

The second framework identified in table 2 , the I-PASS (Illness severity, Patient summary, Action list, Situational awareness and contingency planning, and Synthesis by receiver) handoff mnemonic, 11   is an evidence-based best practice that, when incorporated as part of a handoff bundle, has been shown to reduce handoff errors on pediatric wards. 13   Researchers choosing this conceptual framework may adapt some or all of the I-PASS elements for resident handoffs in the intensive care unit.

Note that both of the conceptual frameworks outlined above provide researchers with a structured approach to addressing the issue of handoff errors; one is not necessarily better than the other. Indeed, it is possible for researchers to use both frameworks when designing their study. Ultimately, we provide this example to demonstrate the necessity of selecting conceptual frameworks to clarify the research purpose. 3 , 8   Readers should look for conceptual frameworks in the introduction section and should be wary of their omission, as commonly seen in less well-developed medical education research articles. 14  

Statement of Study Intent

After reviewing the literature, articulating the problem statement, and selecting a conceptual framework to address the research topic, the final step in clarifying the research purpose is the statement of study intent. The statement of study intent is arguably the most important element of framing the study because it makes the research purpose explicit. 2   Consider the following example:

This study aimed to test the hypothesis that the introduction of the BASIC Examination was associated with an accelerated knowledge acquisition during residency training, as measured by increments in annual ITE scores. 15  

This statement of study intent succinctly identifies several key study elements including the population (anesthesiology residents), the intervention/independent variable (introduction of the BASIC Examination), the outcome/dependent variable (knowledge acquisition, as measured by In-training Examination [ITE] scores), and the hypothesized relationship between the independent and dependent variable (the authors hypothesize a positive correlation between the BASIC Examination and the speed of knowledge acquisition). 6 , 14

The statement of study intent will sometimes manifest as a research objective, rather than hypothesis or question. In such instances there may not be explicit independent and dependent variables, but the study population and research aim should be clearly identified. The following is an example:

“In this report, we present the results of 3 [years] of course data with respect to the practice improvements proposed by participating anesthesiologists and their success in implementing those plans. Specifically, our primary aim is to assess the frequency and type of improvements that were completed and any factors that influence completion.” 16  

The statement of study intent is the logical culmination of the literature review, problem statement, and conceptual framework, and is a transition point between the Introduction and Methods sections of a medical education research report. Nonetheless, a systematic review of experimental research in medical education demonstrated that statements of study intent are absent in the majority of articles. 14   When reading a medical education research article where the statement of study intent is absent, it may be necessary to infer the research aim by gathering information from the Introduction and Methods sections. In these cases, it can be useful to identify the following key elements 6 , 14 , 17   :

  • Population of interest/type of learner (e.g., pain medicine fellow or anesthesiology residents)
  • Independent/predictor variable (e.g., educational intervention or characteristic of the learners)
  • Dependent/outcome variable (e.g., intubation skills or knowledge of anesthetic agents)
  • Relationship between the variables (e.g., "improve" or "mitigate")

Occasionally, it may be difficult to differentiate the independent study variable from the dependent study variable. 17   For example, consider a study aiming to measure the relationship between burnout and personal debt among anesthesiology residents. Do the researchers believe burnout might lead to high personal debt, or that high personal debt may lead to burnout? This “chicken or egg” conundrum reinforces the importance of the conceptual framework which, if present, should serve as an explanation or rationale for the predicted relationship between study variables.

Research methodology is the “…design or plan that shapes the methods to be used in a study.” 1   Essentially, methodology is the general strategy for answering a research question, whereas methods are the specific steps and techniques that are used to collect data and implement the strategy. Our objective here is to provide an overview of quantitative methodologies ( i.e. , approaches) in medical education research.

The choice of research methodology is made by balancing the approach that best answers the research question against the feasibility of completing the study. There is no perfect methodology because each has its own potential caveats, flaws and/or sources of bias. Before delving into an overview of the methodologies, it is important to highlight common sources of bias in education research. We use the term internal validity to describe the degree to which the findings of a research study represent “the truth,” as opposed to some alternative hypothesis or variables. 18   Table 3   18–20   provides a list of common threats to internal validity in medical education research, along with tactics to mitigate these threats.

Table 3. Threats to Internal Validity and Strategies to Mitigate Their Effects

Experimental Research

The fundamental tenet of experimental research is the manipulation of an independent or experimental variable to measure its effect on a dependent or outcome variable.

True Experiment

True experimental study designs minimize threats to internal validity by randomizing study subjects to experimental and control groups. By ensuring that differences between groups, beyond the intervention or variable of interest, are purely due to chance, researchers reduce the internal validity threats related to subject characteristics, time-related maturation, and regression to the mean. 18 , 19

Quasi-experiment

There are many instances in medical education where randomization may not be feasible or ethical. For instance, researchers wanting to test the effect of a new curriculum among medical students may not be able to randomize learners due to competing curricular obligations and schedules. In these cases, researchers may be forced to assign subjects to experimental and control groups based upon some other criterion beyond randomization, such as different classrooms or different sections of the same course. This process, called quasi-randomization, does not inherently lead to internal validity threats, as long as research investigators are mindful of measuring and controlling for extraneous variables between study groups. 19  

Single-group Methodologies

All experimental study designs compare two or more groups: experimental and control. A common experimental study design in medical education research is the single-group pretest–posttest design, which compares a group of learners before and after the implementation of an intervention. 21   In essence, a single-group pre–post design compares an experimental group ( i.e. , postintervention) to a “no-intervention” control group ( i.e. , preintervention). 19   This study design is problematic for several reasons. Consider the following hypothetical example: A research article reports the effects of a year-long intubation curriculum for first-year anesthesiology residents. All residents participate in monthly, half-day workshops over the course of an academic year. The article reports a positive effect on residents’ skills as demonstrated by a significant improvement in intubation success rates at the end of the year when compared to the beginning.

This study does little to advance the science of learning among anesthesiology residents. While this hypothetical report demonstrates an improvement in residents’ intubation success before versus after the intervention, it does not tell why the workshop worked, how it compares to other educational interventions, or how it fits in to the broader picture of anesthesia training.

Single-group pre–post study designs open themselves to a myriad of threats to internal validity. 20   In our hypothetical example, the improvement in residents’ intubation skills may have been due to other educational experience(s) ( i.e. , implementation threat) and/or improvement in manual dexterity that occurred naturally with time ( i.e. , maturation threat), rather than the airway curriculum. Consequently, single-group pre–post studies should be interpreted with caution. 18  

Repeated testing, before and after the intervention, is one strategy that can be used to reduce some of the inherent limitations of the single-group study design. Repeated pretesting can mitigate the effect of regression toward the mean, a statistical phenomenon whereby low pretest scores tend to move closer to the mean on subsequent testing (regardless of intervention). 20   Likewise, repeated posttesting at multiple time intervals can provide potentially useful information about the short- and long-term effects of an intervention (e.g., the "durability" of the gain in knowledge, skill, or attitude).

Observational Research

Unlike experimental studies, observational research does not involve manipulation of any variables. These studies often involve measuring associations, developing psychometric instruments, or conducting surveys.

Association Research

Association research seeks to identify relationships between two or more variables within a group or groups (correlational research), or similarities/differences between two or more existing groups (causal–comparative research). For example, correlational research might seek to measure the relationship between burnout and educational debt among anesthesiology residents, while causal–comparative research may seek to measure differences in educational debt and/or burnout between anesthesiology and surgery residents. Notably, association research may identify relationships between variables, but does not necessarily support a causal relationship between them.

Psychometric and Survey Research

Psychometric instruments measure a psychologic or cognitive construct such as knowledge, satisfaction, beliefs, and symptoms. Surveys are one type of psychometric instrument, but many other types exist, such as evaluations of direct observation, written examinations, or screening tools. 22   Psychometric instruments are ubiquitous in medical education research and can be used to describe a trait within a study population ( e.g. , rates of depression among medical students) or to measure associations between study variables ( e.g. , association between depression and board scores among medical students).

Psychometric and survey research studies are prone to the internal validity threats listed in table 3 , particularly those relating to mortality, location, and instrumentation. 18   Additionally, readers must ensure that the instrument scores can be trusted to truly represent the construct being measured. For example, suppose you encounter a research article demonstrating a positive association between attending physician teaching effectiveness as measured by a survey of medical students, and the frequency with which the attending physician provides coffee and doughnuts on rounds. Can we be confident that this survey administered to medical students is truly measuring teaching effectiveness? Or is it simply measuring the attending physician’s “likability”? Issues related to measurement and the trustworthiness of data are described in detail in the following section on measurement and the related issues of validity and reliability.

Measurement refers to “the assigning of numbers to individuals in a systematic way as a means of representing properties of the individuals.” 23   Research data can only be trusted insofar as we trust the measurement used to obtain the data. Measurement is of particular importance in medical education research because many of the constructs being measured ( e.g. , knowledge, skill, attitudes) are abstract and subject to measurement error. 24   This section highlights two specific issues related to the trustworthiness of data: the validity and reliability of measurements.

Validity regarding the scores of a measurement instrument “refers to the degree to which evidence and theory support the interpretations of the [instrument’s results] for the proposed use of the [instrument].” 25   In essence, do we believe the results obtained from a measurement really represent what we were trying to measure? Note that validity evidence for the scores of a measurement instrument is separate from the internal validity of a research study. Several frameworks for validity evidence exist. Table 4 2 , 22 , 26   represents the most commonly used framework, developed by Messick, 27   which identifies sources of validity evidence—to support the target construct—from five main categories: content, response process, internal structure, relations to other variables, and consequences.

Table 4. Sources of Validity Evidence for Measurement Instruments

Reliability

Reliability refers to the consistency of scores for a measurement instrument. 22 , 25 , 28   For an instrument to be reliable, we would anticipate that two individuals rating the same object of measurement in a specific context would provide the same scores. 25   Further, if the scores for an instrument are reliable between raters of the same object of measurement, then we can extrapolate that any difference in scores between two objects represents a true difference across the sample, and is not due to random variation in measurement. 29   Reliability can be demonstrated through a variety of methods such as internal consistency ( e.g. , Cronbach’s alpha), temporal stability ( e.g. , test–retest reliability), interrater agreement ( e.g. , intraclass correlation coefficient), and generalizability theory (generalizability coefficient). 22 , 29  
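As one illustration of an internal-consistency estimate, the sketch below computes Cronbach's alpha from a small, invented respondents-by-items matrix using the standard variance-based formula; the ratings are fabricated solely for the example.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = item_scores.shape[1]                          # number of items
    item_vars = item_scores.var(axis=0, ddof=1)       # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 5 respondents x 4 items on a 1-5 scale
ratings = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```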

Example of a Validity and Reliability Argument

This section provides an illustration of validity and reliability in medical education. We use the signaling questions outlined in table 4 to make a validity and reliability argument for the Harvard Assessment of Anesthesia Resident Performance (HARP) instrument. 7   The HARP was developed by Blum et al. to measure the performance of anesthesia trainees that is required to provide safe anesthetic care to patients. According to the authors, the HARP is designed to be used “…as part of a multiscenario, simulation-based assessment” of resident performance. 7  

Content Validity: Does the Instrument’s Content Represent the Construct Being Measured?

To demonstrate content validity, instrument developers should describe the construct being measured and how the instrument was developed, and justify their approach. 25   The HARP is intended to measure resident performance in the critical domains required to provide safe anesthetic care. As such, investigators note that the HARP items were created through a two-step process. First, the instrument’s developers interviewed anesthesiologists with experience in resident education to identify the key traits needed for successful completion of anesthesia residency training. Second, the authors used a modified Delphi process to synthesize the responses into five key behaviors: (1) formulate a clear anesthetic plan, (2) modify the plan under changing conditions, (3) communicate effectively, (4) identify performance improvement opportunities, and (5) recognize one’s limits. 7 , 30  

Response Process Validity: Are Raters Interpreting the Instrument Items as Intended?

In the case of the HARP, the developers included a scoring rubric with behavioral anchors to ensure that faculty raters could clearly identify how resident performance in each domain should be scored. 7  

Internal Structure Validity: Do Instrument Items Measuring Similar Constructs Yield Homogenous Results? Do Instrument Items Measuring Different Constructs Yield Heterogeneous Results?

Item-correlation for the HARP demonstrated a high degree of correlation between some items ( e.g. , formulating a plan and modifying the plan under changing conditions) and a lower degree of correlation between other items ( e.g. , formulating a plan and identifying performance improvement opportunities). 30   This finding is expected since the items within the HARP are designed to assess separate performance domains, and we would expect residents’ functioning to vary across domains.

Relationship to Other Variables’ Validity: Do Instrument Scores Correlate with Other Measures of Similar or Different Constructs as Expected?

As it applies to the HARP, one would expect that the performance of anesthesia residents will improve over the course of training. Indeed, HARP scores were found to be generally higher among third-year residents compared to first-year residents. 30  

Consequence Validity: Are Instrument Results Being Used as Intended? Are There Unintended or Negative Uses of the Instrument Results?

While investigators did not intentionally seek out consequence validity evidence for the HARP, unanticipated consequences of HARP scores were identified by the authors as follows:

“Data indicated that CA-3s had a lower percentage of worrisome scores (rating 2 or lower) than CA-1s… However, it is concerning that any CA-3s had any worrisome scores…low performance of some CA-3 residents, albeit in the simulated environment, suggests opportunities for training improvement.” 30  

That is, using the HARP to measure the performance of CA-3 anesthesia residents had the unintended consequence of identifying the need for improvement in resident training.

Reliability: Are the Instrument’s Scores Reproducible and Consistent between Raters?

The HARP was applied by two raters for every resident in the study across seven different simulation scenarios. The investigators conducted a generalizability study of HARP scores to estimate the variance in assessment scores that was due to the resident, the rater, and the scenario. They found little variance was due to the rater ( i.e. , scores were consistent between raters), indicating a high level of reliability. 7  

Sampling refers to the selection of research subjects (i.e., the sample) from a larger group of eligible individuals (i.e., the population). 31   Effective sampling leads to the inclusion of research subjects who represent the larger population of interest. Alternatively, ineffective sampling may lead to the selection of research subjects who are significantly different from the target population. Imagine that researchers want to explore the relationship between burnout and educational debt among pain medicine specialists. The researchers distribute a survey to 1,000 pain medicine specialists (the population), but only 300 individuals complete the survey (the sample). This result is problematic because the characteristics of the individuals who completed the survey and those of the entire population of pain medicine specialists may be fundamentally different. It is possible that the 300 study subjects were experiencing more burnout and/or debt, and thus were more motivated to complete the survey. Alternatively, the 700 nonresponders might have been too busy to respond and even more burned out than the 300 responders, in which case the problem would be even greater than the study findings suggest.

When evaluating a medical education research article, it is important to identify the sampling technique the researchers employed, how it might have influenced the results, and whether the results apply to the target population. 24  

Sampling Techniques

Sampling techniques generally fall into two categories: probability- or nonprobability-based. Probability-based sampling ensures that each individual within the target population has an equal opportunity of being selected as a research subject. Most commonly, this is done through random sampling, which should lead to a sample of research subjects that is similar to the target population. If significant differences between sample and population exist, those differences should be due to random chance, rather than systematic bias. The difference between data from a random sample and that from the population is referred to as sampling error. 24  
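The following small simulation, built on an invented population of burnout scores, illustrates how a random sample's mean approximates the population mean and how sampling error tends to shrink as the sample grows; all values are simulated for illustration.

```python
import numpy as np

# Hypothetical population: burnout scores (roughly 0-100) for 10,000 specialists
rng = np.random.default_rng(42)
population = rng.normal(loc=55, scale=12, size=10_000)

# Draw random samples of increasing size; larger samples track the population mean more closely
for n in (30, 300, 3_000):
    sample = rng.choice(population, size=n, replace=False)
    print(f"n = {n:5d}  sample mean = {sample.mean():.2f}  (population mean = {population.mean():.2f})")
```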

Nonprobability-based sampling involves selecting research participants such that inclusion of some individuals may be more likely than the inclusion of others. 31   Convenience sampling is one such example and involves selection of research subjects based upon ease or opportuneness. Convenience sampling is common in medical education research, but, as outlined in the example at the beginning of this section, it can lead to sampling bias. 24   When evaluating an article that uses nonprobability-based sampling, it is important to look for participation/response rate. In general, a participation rate of less than 75% should be viewed with skepticism. 21   Additionally, it is important to determine whether characteristics of participants and nonparticipants were reported and if significant differences between the two groups exist.

Interpreting medical education research requires a basic understanding of common ways in which quantitative data are analyzed and displayed. In this section, we highlight two broad topics that are of particular importance when evaluating research articles.

The Nature of the Measurement Variable

Measurement variables in quantitative research generally fall into three categories: nominal, ordinal, or interval. 24   Nominal variables (sometimes called categorical variables) involve data that can be placed into discrete categories without a specific order or structure. Examples include sex (male or female) and professional degree (M.D., D.O., M.B.B.S., etc .) where there is no clear hierarchical order to the categories. Ordinal variables can be ranked according to some criterion, but the spacing between categories may not be equal. Examples of ordinal variables may include measurements of satisfaction (satisfied vs . unsatisfied), agreement (disagree vs . agree), and educational experience (medical student, resident, fellow). As it applies to educational experience, it is noteworthy that even though education can be quantified in years, the spacing between years ( i.e. , educational “growth”) remains unequal. For instance, the difference in performance between second- and third-year medical students is dramatically different than third- and fourth-year medical students. Interval variables can also be ranked according to some criteria, but, unlike ordinal variables, the spacing between variable categories is equal. Examples of interval variables include test scores and salary. However, the conceptual boundaries between these measurement variables are not always clear, as in the case where ordinal scales can be assumed to have the properties of an interval scale, so long as the data’s distribution is not substantially skewed. 32  

Understanding the nature of the measurement variable is important when evaluating how the data are analyzed and reported. Medical education research commonly uses measurement instruments with items that are rated on Likert-type scales, whereby the respondent is asked to assess their level of agreement with a given statement. The response is often translated into a corresponding number (e.g., 1 = strongly disagree, 3 = neutral, 5 = strongly agree). Notably, scores from Likert-type scales are sometimes not normally distributed (i.e., they are skewed toward one end of the scale), indicating that the spacing between scores is unequal and the variable is ordinal in nature. In these cases, it is recommended to report results as frequencies or medians, rather than means and SDs. 33

Consider an article evaluating medical students’ satisfaction with a new curriculum. Researchers measure satisfaction using a Likert-type scale (1 = very unsatisfied, 2 = unsatisfied, 3 = neutral, 4 = satisfied, 5 = very satisfied). A total of 20 medical students evaluate the curriculum, 10 of whom rate their satisfaction as “satisfied,” and 10 of whom rate it as “very satisfied.” In this case, it does not make much sense to report an average score of 4.5; it makes more sense to report results in terms of frequency ( e.g. , half of the students were “very satisfied” with the curriculum, and half were not).
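A minimal sketch of that reporting choice, using the invented responses from the example above, might look like this in Python (pandas assumed):

```python
import pandas as pd

# Hypothetical Likert responses from 20 students (1 = very unsatisfied ... 5 = very satisfied)
responses = pd.Series([4] * 10 + [5] * 10)

# For ordinal data, report frequencies (or medians) rather than a mean
counts = responses.value_counts().sort_index()
percent = (counts / len(responses) * 100).round(1)
print(pd.DataFrame({"count": counts, "percent": percent}))
```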

Effect Size and CIs

In medical education, as in other research disciplines, it is common to report statistically significant results (i.e., small P values) in order to increase the likelihood of publication. 34 , 35   However, a significant P value in itself does not necessarily represent the educational impact of the study results. A statement like "Intervention x was associated with a significant improvement in learners' intubation skill compared to education intervention y (P < 0.05)" tells us that there was a less than 5% chance that the difference in improvement between interventions x and y was due to chance. Yet that does not mean that the study intervention necessarily caused the nonchance results, nor does it indicate whether the between-group difference is educationally significant. Therefore, readers should consider looking beyond the P value to effect size and/or CI when interpreting the study results. 36 , 37

Effect size is "the magnitude of the difference between two groups," which helps to quantify the educational significance of the research results. 37   Common measures of effect size include Cohen's d (standardized difference between two means), risk ratio (compares binary outcomes between two groups), and Pearson's r correlation (linear relationship between two continuous variables). 37   CIs represent "a range of values around a sample mean or proportion" and are a measure of precision. 31   While effect size and CI give more useful information than simple statistical significance, they are commonly omitted from medical education research articles. 35   In such instances, readers should be wary of overinterpreting a P value in isolation. For further information on effect size and CIs, we direct readers to the work of Sullivan and Feinn 37   and Hulley et al. 31
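To make these quantities concrete, the sketch below computes Cohen's d and a 95% CI for the difference in means between two invented groups of skill scores; the numbers and group labels are illustrative assumptions only.

```python
import numpy as np
from scipy import stats

# Hypothetical skill scores for intervention x and intervention y groups
group_x = np.array([78, 82, 85, 80, 84, 79, 83, 81])
group_y = np.array([74, 77, 75, 79, 73, 76, 78, 75])

n_x, n_y = len(group_x), len(group_y)
diff = group_x.mean() - group_y.mean()

# Cohen's d: standardized difference between two means (pooled SD)
pooled_sd = np.sqrt(((n_x - 1) * group_x.var(ddof=1) + (n_y - 1) * group_y.var(ddof=1)) / (n_x + n_y - 2))
cohens_d = diff / pooled_sd

# 95% CI for the difference in means (equal-variance t-interval)
se = pooled_sd * np.sqrt(1 / n_x + 1 / n_y)
t_crit = stats.t.ppf(0.975, n_x + n_y - 2)
ci = (diff - t_crit * se, diff + t_crit * se)

print(f"Cohen's d = {cohens_d:.2f}")
print(f"difference = {diff:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```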

In this final section, we identify instruments that can be used to evaluate the quality of quantitative medical education research articles. To this point, we have focused on framing the study and research methodologies and identifying potential pitfalls to consider when appraising a specific article. This is important because how a study is framed and the choice of methodology require some subjective interpretation. Fortunately, there are several instruments available for evaluating medical education research methods and providing a structured approach to the evaluation process.

The Medical Education Research Study Quality Instrument (MERSQI) 21   and the Newcastle Ottawa Scale-Education (NOS-E) 38   are two commonly used instruments, both of which have an extensive body of validity evidence to support the interpretation of their scores. Table 5 21 , 39   provides more detail regarding the MERSQI, which includes evaluation of study design, sampling, data type, validity, data analysis, and outcomes. We have found that applying the MERSQI to manuscripts, articles, and protocols has intrinsic educational value, because this practice of application familiarizes MERSQI users with fundamental principles of medical education research. One aspect of the MERSQI that deserves special mention is the section on evaluating outcomes based on Kirkpatrick’s widely recognized hierarchy of reaction, learning, behavior, and results ( table 5 ; fig .). 40   Validity evidence for the scores of the MERSQI include its operational definitions to improve response process, excellent reliability, and internal consistency, as well as high correlation with other measures of study quality, likelihood of publication, citation rate, and an association between MERSQI score and the likelihood of study funding. 21 , 41   Additionally, consequence validity for the MERSQI scores has been demonstrated by its utility for identifying and disseminating high-quality research in medical education. 42  

Fig. Kirkpatrick's hierarchy of outcomes as applied to education research. Reaction = Level 1, Learning = Level 2, Behavior = Level 3, Results = Level 4. Outcomes become more meaningful, yet more difficult to achieve, when progressing from Level 1 through Level 4. Adapted with permission from Beckman and Cook, 2007. 2

Table 5. The Medical Education Research Study Quality Instrument (MERSQI) for Evaluating the Quality of Medical Education Research

The NOS-E is a newer tool to evaluate the quality of medical education research. It was developed as a modification of the Newcastle-Ottawa Scale 43   for appraising the quality of nonrandomized studies. The NOS-E includes items focusing on the representativeness of the experimental group, selection and compatibility of the control group, missing data/study retention, and blinding of outcome assessors. 38 , 39   Additional validity evidence for NOS-E scores includes operational definitions to improve response process, excellent reliability and internal consistency, and its correlation with other measures of study quality. 39   Notably, the complete NOS-E, along with its scoring rubric, can be found in the article by Cook and Reed. 39

A recent comparison of the MERSQI and NOS-E found acceptable interrater reliability and good correlation between the two instruments. 39 However, there are notable differences between them. The MERSQI may be applied to a broad range of study designs, including experimental and cross-sectional research; it addresses issues related to measurement validity and data analysis, and it places particular emphasis on educational outcomes. The NOS-E, on the other hand, focuses specifically on experimental study designs and on issues related to sampling techniques and outcome assessment. 39 Ultimately, the MERSQI and NOS-E are complementary tools that may be used together when evaluating the quality of medical education research.
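For readers who want to see what "good correlation between the two instruments" means operationally, the following sketch computes Pearson and Spearman correlations over paired quality scores. The scores are invented and SciPy is assumed to be available; this illustrates the calculation only, and is not a reproduction of the published comparison.

```python
# Hypothetical paired quality scores for ten studies, each appraised with
# both instruments; the numbers are invented for illustration only.
from scipy.stats import pearsonr, spearmanr

mersqi_scores = [10.5, 12.0, 13.5, 9.0, 15.5, 11.0, 14.0, 8.5, 16.0, 12.5]
nose_scores = [3.0, 4.0, 4.5, 2.5, 5.5, 3.5, 5.0, 2.0, 6.0, 4.0]

r, p = pearsonr(mersqi_scores, nose_scores)        # linear correlation
rho, p_s = spearmanr(mersqi_scores, nose_scores)   # rank-order correlation

print(f"Pearson r = {r:.2f} (p = {p:.3f})")
print(f"Spearman rho = {rho:.2f} (p = {p_s:.3f})")
```

A published comparison would also report an agreement statistic for raters (for example, an intraclass correlation), but the basic logic is the same: paired scores per study and a correlation coefficient across studies.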

Conclusions

This article provides an overview of quantitative research in medical education, underscores the main components of education research, and provides a general framework for evaluating research quality. We highlighted the importance of framing a study with respect to purpose, conceptual framework, and statement of study intent. We reviewed the most common research methodologies, along with threats to the validity of a study and its measurement instruments. Finally, we identified two complementary instruments, the MERSQI and NOS-E, for evaluating the quality of a medical education research study.

Bordage G: Conceptual frameworks to illuminate and magnify. Medical Education. 2009; 43(4):312–9.

Cook DA, Beckman TJ: Current concepts in validity and reliability for psychometric instruments: Theory and application. The American Journal of Medicine. 2006; 119(2):166.e7–166.e16.

Fraenkel JR, Wallen NE, Hyun HH: How to Design and Evaluate Research in Education. 9th edition. New York, McGraw-Hill Education, 2015.

Hulley SB, Cummings SR, Browner WS, Grady DG, Newman TB: Designing Clinical Research. 4th edition. Philadelphia, Lippincott Williams & Wilkins, 2011.

Irby BJ, Brown G, Lara-Alecio R, Jackson S: The Handbook of Educational Theories. Charlotte, NC, Information Age Publishing, Inc., 2015.

American Educational Research Association, American Psychological Association: Standards for Educational and Psychological Testing. Washington, DC, American Educational Research Association, 2014.

Swanwick T: Understanding Medical Education: Evidence, Theory and Practice. 2nd edition. Wiley-Blackwell, 2013.

Sullivan GM, Artino AR Jr: Analyzing and interpreting data from Likert-type scales. Journal of Graduate Medical Education. 2013; 5(4):541–2.

Sullivan GM, Feinn R: Using effect size—or why the P value is not enough. Journal of Graduate Medical Education. 2012; 4(3):279–82.

Tavakol M, Sandars J: Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II. Medical Teacher. 2014; 36(10):838–48.

Support was provided solely from institutional and/or departmental sources.

The authors declare no competing interests.

ReviseSociology

A level sociology revision – education, families, research methods, crime and deviance and more!

Research Methods

Last Updated on October 13, 2023 by Karl Thompson

Sociologists use a range of quantitative and qualitative, primary and secondary social research methods to collect data about society.

The main types of research method are:

  • Social surveys (questionnaires and structured interviews)
  • Experiments (lab and field)
  • Unstructured interviews
  • Participant observation
  • Secondary qualitative data
  • Official statistics

This page provides links to more in-depth posts on all of the above research methods. It has primarily been written for students studying the A Level Sociology AQA 7192 specification, and incorporates Methods in the Context of Education.

Research Methods at a Glance – Key Concepts  

Research Methods Top Ten Key Concepts – start here if you’re all at sea – includes simple explanations of terms such as validity, reliability, representativeness, Positivism and Interpretivism.

Research Methods A-Z Glossary – a more comprehensive index of the key terms you need to know for AS and A Level Sociology.

An Introduction to Research Methods

Without research methods there is no sociology!

This section covers the basics of the different types of research method and the factors influencing choice of research methods, as well as the important distinction between Positivism and Interpretivism.

Research Methods in Sociology – An Introduction – detailed class notes covering the basic types of research method available to sociologists such as social surveys, interviews, experiments, and observations

Factors Affecting Choice of Research Topic in Sociology – detailed class notes on the theoretical, ethical, and practical factors affecting the choice of research topic.

Factors Affecting Choice of Research Method in Sociology – detailed class notes covering theoretical, practical and ethical factors and the nature of the topic. NB choice of topic will affect choice of research method; choice of topic and method are different issues!

Positivism and Interpretivism – Positivists generally prefer quantitative methods, while Interpretivists prefer qualitative methods – this post consists of brief summary revision notes and revision diagrams outlining the difference between positivist and interpretivist approaches to social research.

Positivism, Sociology and Social Research   – detailed class notes on the relationship between The Enlightenment, industrialisation and positivist sociology, which sees sociology as a science.  

Stages of Social Research  – detailed class notes covering research design, operationalising concepts, sampling, pilot studies, data collection and data analysis. 

Outline and explain two practical problems which might affect social research (10) –  A model answer to this exam question, which could appear on either paper 7191 (1) or 7191 (3). 

Good Resources for Teaching and Learning Research Methods – simply links (with brief descriptions) which take you to a range of textbooks and websites focusing on various aspects of quantitative and qualitative research methods. NB this post is very much a work in progress and is updated constantly.

Primary Quantitative Research Methods

Social Surveys

An Introduction to Social Surveys  – a brief introduction to the use of different types of survey in social research, including structured questionnaires and interviews and different ways of administering surveys such as online, by phone or face to face.

The advantages and disadvantages of social surveys in social research – detailed class notes covering the theoretical, practical and ethical strengths and limitations of social surveys. Generally, surveys are preferred by positivists and are good for simple topics, but not so good for more complex topics which require a ‘human touch’.

Structured Interviews in Social Research – Interviews are effectively one of the means of administering social surveys. This post covers the different contexts (types) of structured interview, and the stages of doing them. It also looks at the strengths, limitations and criticisms.
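The reason surveys sit on the quantitative side of the divide is that closed questions can be coded directly into numbers. A minimal sketch in Python, assuming a hypothetical five-point agreement item:

```python
import statistics

# Hypothetical closed (Likert-style) question: "I feel safe travelling to school."
coding = {"strongly disagree": 1, "disagree": 2, "neither": 3,
          "agree": 4, "strongly agree": 5}

responses = ["agree", "strongly agree", "neither", "agree",
             "disagree", "agree", "strongly agree"]

coded = [coding[r] for r in responses]
print("coded responses:", coded)            # [4, 5, 3, 4, 2, 4, 5]
print("median:", statistics.median(coded))  # 4
print("mode:", statistics.mode(coded))      # 4
```

Because Likert-style items are ordinal rather than interval data, the median and mode are usually safer summaries than the mean.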

Experiments

An Introduction to Experiments in Sociology   – a brief introduction covering definitions of key terms including hypotheses, dependent and independent variables and the Hawthorne Effect. NB sociologists don’t generally use experiments, especially not lab experiments, but you still need to know about them! 

Laboratory Experiments in Sociology   – detailed class notes on the strengths and limitations of laboratory experiments. Sociologists don’t generally use lab experiments, but examiners seem to ask questions about them more than other methods – one hypothesis for why is that sociology examiners have a burning hatred of teenagers. 

Field Experiments in Sociology   – detailed class notes on the strengths and limitations of field experiments. Field experiments take place in real life social settings so are more ‘sociological’ than lab experiments.

Seven Examples of Field Experiments for Sociology – class notes outlining a mixture of seven classic and contemporary field experiments relevant to various aspects of the AS and A level sociology syllabus.

Longitudinal Studies

Longitudinal Studies – These are interval studies designed to explore changes over a long period of time. Researchers start with a sample and keep going back to that same sample periodically – say every year, or every two years – to explore how and why changes occur.
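To make the design concrete, the sketch below stores a small, entirely hypothetical panel dataset in pandas, with the same three respondents surveyed in three waves, and computes each person’s change between the first and last wave.

```python
import pandas as pd

# Hypothetical longitudinal (panel) data: the same three respondents
# surveyed in three waves, two years apart.
panel = pd.DataFrame({
    "respondent": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "wave_year": [2019, 2021, 2023] * 3,
    "wellbeing": [6.0, 6.5, 7.0, 5.0, 4.5, 5.5, 7.5, 7.0, 7.0],
})

# Within-person change between the first and last wave.
change = (
    panel.sort_values(["respondent", "wave_year"])
         .groupby("respondent")["wellbeing"]
         .agg(lambda s: s.iloc[-1] - s.iloc[0])
)
print(change)  # A: 1.0, B: 0.5, C: -0.5 (last wave minus first wave)
```

Separating change within individuals from differences between individuals is exactly what repeated contact with the same sample makes possible.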

The Seven Up Series – an in-depth look at Britain’s longest running and best loved Longitudinal study.

What Makes a Good Life? – Lessons from a Longitudinal Study – This is one of the longest running longitudinal studies in the world – the respondents were in their 20s when it started, and those who are still alive are now in their 80s.

Primary Qualitative Research Methods

Primary qualitative research methods tend to be favoured by Interpretivists as they allow respondents to speak for themselves, and should thus yield valid data. However, because qualitative methods tend to involve the researcher getting more involved with the respondents, there is a risk that the subjective views of the researcher could interfere with the results, which could compromise both the validity and reliability of such methods.

Qualitative research methods also tend to be time-consuming, so it can be difficult to use them with large samples of people.

Participant Observation

Overt and Covert Participant Observation – Participant observation is where researchers take part in the life of respondents, sometimes for several months or even years, and try to ‘see the world through their eyes’. Overt research is where respondents know the researcher is doing research; covert is where the researcher is undercover.

The strengths and limitations of covert participant observation – sociologists don’t generally use covert participant observation because the ethical problem of deception means they can’t get funding. This method is more commonly used by journalists doing investigative reporting (you could even say undercover police officers use it), and you can use these examples to illustrate the advantages and disadvantages of the method.

Some recent examples of sociological studies using participant observation – including Pearson’s covert research into football hooligans and Mears’s research into the modelling industry.

Non-Participant Observation – detailed class notes on non-participant observation, where the researcher observes and records from the sidelines without taking part. Probably the most commonly used form of this is the OFSTED inspection.

Interviews in Social Research – This post consists of detailed class notes focusing on the strengths and limitations of mainly unstructured interviews, which are like a guided conversation and allow respondents the freedom to speak for themselves.

Secondary Research Methods

Official Statistics

Official Statistics in Sociology  –  class notes on the general strengths and limitations of official statistics, which are numerical data collected by the government. Examples include crime statistics, school league tables and education statistics.

Evaluating the Usefulness of Official Statistics – the UK government collects a wide variety of statistics, the validity of which can vary enormously. This post explores the validity of Religious belief statistics, crime and prison statistics, and immigration data, among other sources of data.

Cross National Comparisons – Comparing data across countries using official statistics can provide insight into the causes of social problems such as poverty, and war and conflict. This post looks at how you might go about doing this and the strengths and limitations of this kind of research.

Univariate Analysis in Quantitative Social Research – This involves looking at one variable at a time. This post covers the strengths and limitations of bar charts, pie charts and box plots.
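As a minimal illustration of univariate analysis, the sketch below summarises and plots a single, hypothetical attainment variable with pandas and matplotlib; a histogram and box plot are shown here, and a bar or pie chart would be the equivalent choice for a categorical variable.

```python
import pandas as pd
import matplotlib.pyplot as plt

# One hypothetical variable (a single column = univariate analysis).
scores = pd.Series([42, 55, 48, 61, 39, 58, 47, 52, 66, 44, 50, 57],
                   name="attainment_points")

print(scores.describe())  # count, mean, std and quartiles for the one variable

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
scores.plot(kind="hist", bins=6, ax=ax1, title="Histogram")
scores.plot(kind="box", ax=ax2, title="Box plot")
plt.tight_layout()
plt.show()
```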

Secondary Qualitative Data

Secondary Qualitative Data Analysis in Sociology – class notes covering private and public documents. Public documents include any written or visual document produced with an audience in mind, such as government reports and newspapers, whereas private documents refer to personal documents such as diaries and letters which are not intended to be seen by anyone other than their authors.

Content Analysis of the Media in Social Research – class notes covering formal content (quantitative) analysis and semiology.
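Formal (quantitative) content analysis boils down to counting predefined categories across a sample of texts. A minimal sketch, using an invented coding frame and invented headlines:

```python
from collections import Counter

# Invented coding frame and headlines; in real content analysis the categories
# and the sample of texts are fixed in advance of any counting.
coding_frame = {
    "crime": ["crime", "police", "prison"],
    "education": ["school", "teacher", "exam"],
    "economy": ["jobs", "wages", "inflation"],
}

headlines = [
    "Exam results improve as schools recover",
    "Police numbers fall while crime rises",
    "Inflation squeezes wages for young workers",
    "Teacher shortages hit rural schools",
]

counts = Counter()
for headline in headlines:
    text = headline.lower()
    for category, keywords in coding_frame.items():
        if any(keyword in text for keyword in keywords):
            counts[category] += 1

print(counts)  # Counter({'education': 2, 'crime': 1, 'economy': 1})
```

Semiology, by contrast, is qualitative: it interprets what the signs in a text mean rather than counting how often they occur.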

Personal documents in social research – a more in-depth look at the strengths and limitations of using sources such as diaries and letters as sources of data.

Autobiographies in social research – Autobiographies are popular with the British public, but how useful are they as sources of data for the social researcher?

Sociology, Science and Value Freedom (Part of A2 Theory and Methods)

Sociology and Value Freedom – detailed class notes.

Methods in Context – Research Methods Applied to Education

Field Experiments applied to Education – are Chinese Teaching Methods the Best? This is a summary of a documentary in which some students at one school experienced a Chinese style of teaching for 3 months, involving 12-hour days and ‘teach from the front’ techniques. The students were then tested and their results compared to students from the same school who stuck to the traditional British way of teaching. The results may surprise you!

Participant Observation in Education – focusing on the work of Paul Willis and Mac an Ghaill.

Non-Participant Observation in Education – focusing on OFSTED inspections, as these are probably the most commonly used of all methods in education.

The Strengths and Limitations of Education Statistics – This post discusses the strengths and limitations of results statistics. NB these may not be as valid as you think.

Evaluating the Usefulness of Secondary Qualitative Data to Research Education – there are a lot of documents sociologists may use to research education, including school promotional literature and websites, policy documents, written records on students, and, if they can access them, personal messages between students referring to what they think about school.

Focus on the AS and A Level Exams

Research Methods Practice Questions for A-level Sociology – you will get a 10-mark question on both papers SCLY1 and SCLY3, most likely asking you to ‘outline and explain’ the strengths and limitations of any of the main research methods. This post outlines some of the many variations.

Research Methods Essays – How to Write Them – general advice on writing research methods essays for the AS and A level sociology exams. This post covers the PET technique – Practical, Ethical and Theoretical.

Assess the Strengths of Using Participant Observation in Social Research (20) – example essay, top mark band.

Methods in Context Essay Template  – a suggested gap fill essay plan on how to answer these challenging ‘applied research methods’ questions.

Methods in Context Mark Scheme  – pared down mark scheme – easy to understand! It may surprise you to know that you can get up to 12/20 for just writing about the method, without even applying it to the question!

Outline and explain two advantages of overt compared to covert observation (10) – you might think that being undercover provides you with more valid data than when respondents know you are observing them; however, there are a few reasons why this might not be the case. This post explores why, along with some of the other advantages overt has over covert observation. (Honestly, covert is a lot of hassle!) NB this post is written as a response to an exam-style question.

Using Participant Observation to research pupils with behavioural difficulties (20) – a model answer for this methods in context style of essay.

For more links to methods and applied methods essays see my page – ‘Exams, Essays and Short Answer Questions’.

Other Relevant Posts

Learning to Labour by Paul Willis – Summary and Evaluation of Research Methods.

How old are Twitter users? – applied sociology – illustrates some of the problems of using social media to uncover social trends.

Twitter users by occupation and social class – applied sociology – illustrates some of the problems of using social media to uncover social trends.

Other posts and links will be forthcoming throughout 2020 – check back soon.

Theory and Methods A Level Sociology Revision Bundle 

If you like this sort of thing, then you might like my Theory and Methods Revision Bundle – specifically designed to get students through the theory and methods sections of  A level sociology papers 1 and 3.

Contents include:

  • 74 pages of revision notes
  • 15 mind maps on various topics within theory and methods
  • Five theory and methods essays
  • ‘How to write methods in context essays’.

For better value I’ve bundled all of the above topics into six revision bundles, containing revision notes, mind maps, and exam questions and answers, available for between £4.99 and £5.99 on Sellfy.

Best value is my A level sociology revision mega bundle – which contains the following:

  • over 200 pages of revision notes
  • 60 mind maps in pdf and png formats
  • 50 short answer exam practice questions and exemplar answers
  • Covers the entire A-level sociology syllabus, AQA focus.
