
Ranking the Ranker: How to Evaluate Institutions, Researchers, Journals, and Conferences?

  • Open access
  • Published: 16 October 2023
  • Volume 65, pages 615–621 (2023)


  • Wil M. P. van der Aalst
  • Oliver Hinz
  • Christof Weinhardt


1 Introduction

How to evaluate institutions, researchers, journals, and conferences? The ranking of scientific research in all its dimensions is food for discussion and the source of major controversies. As editors of the Business & Information Systems Engineering ( BISE ) journal, we want our journal to score excellently in rankings. As individual BISE researchers, we want our research to have a significant impact and see this reflected in rankings. As university employees, we want our university to score well in the global university rankings. Rankings are considered important if one scores well. If one does not score well, then one often finds reasons to downplay the ranking’s importance. Due to the availability of data, it has become easier to generate rankings. Also, scholarly interest in rankings has increased, and “ranking the ranker” has become a vibrant area of study (Hazelkorn 2018 ; Ringel et al. 2021 ; Moed et al. 1985 ; Stolz et al. 2010 ). Rankings also impact individual careers, influence where students want to study, and play a major role in the distribution of research funding.

Although the different types of rankings are widely used, there are also many concerns. The San Francisco Declaration on Research Assessment (DORA) raised concerns related to the “number-based evaluations” of academics (DORA 2012 ). The declaration starts with the statement, “There is a pressing need to improve the ways in which the output of scientific research is evaluated by funding agencies, academic institutions, and other parties”. The DORA declaration also provides 18 recommendations, grouped according to their intended audience: funding agencies, institutions, publishers, organizations that supply metrics, and researchers. The general recommendation is “Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions.” (DORA 2012 ). It is hard to disagree with these recommendations, but a decade after the DORA declaration, better mechanisms still seem to be missing.

In some countries and institutions, it is now even forbidden to mention numerical data (like the H-index and the number of citations) in grant applications. However, reviewers immediately search for the Google Scholar pages of the applicants to get a first impression. Due to the broadness of the different scientific disciplines, it is hard to judge work in a purely qualitative manner. Similarly, it is close to impossible to make objective tenure decisions without quantitative data, such as the number of published papers in different categories, citations, and grants. Completely abandoning numerical data ("bibliometric denialism") creates uncertainty and may lead to highly subjective and only allegedly "fairer" decisions (e.g., years of hard work being judged based on someone's presentation skills).

Moreover, we witness fierce international competition to attract both scientific staff and top students. Here, university rankings do play a major role. Therefore, we cannot simply ignore rankings, whether we like them or not. In this editorial, we give an overview of the different types of rankings and discuss their applicability. Figure  1 provides a high-level overview of the three types of rankings considered.

Fig. 1 The interplay between rankings of institutions, researchers, and outlets (e.g., journals and conferences)

Note that also in science, we can observe the Matthew effect of accumulated advantage. The Matthew principle is also known as “the rich get richer and the poor get poorer” and can be explained by preferential attachment, whereby wealth or credit is distributed among individuals according to how much they already have. This also applies to science. For a highly-ranked university, it is easier to attract excellent researchers, making the university even stronger. For a highly-cited researcher, it is easier to receive research funding, resulting in more PhDs and scientific output. Although the Matthew effect seems unfair, it is also partly inevitable.
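For readers who prefer to see the mechanism rather than the metaphor, preferential attachment is easy to simulate. The Python sketch below uses invented numbers: five parties start with equal credit, and each new unit of credit goes to a party with probability proportional to what it already holds.

```python
import random

def preferential_attachment(steps, parties=5, seed=42):
    """Each step awards one credit unit; the recipient is drawn with
    probability proportional to the credit it already holds."""
    random.seed(seed)
    credit = [1] * parties  # everyone starts equal
    for _ in range(steps):
        winner = random.choices(range(parties), weights=credit)[0]
        credit[winner] += 1
    return credit

final = preferential_attachment(1000)
# The total is fixed (5 + 1000 = 1005 units), but the spread between
# the luckiest and unluckiest party is typically large.
```

Running this for a few hundred steps typically leaves one party with a disproportionate share of the credit, even though all parties started identically, which is the essence of the Matthew effect.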

There is also a competition between different fields of science. BISE researchers compete with researchers in physics, medicine, energy, engineering, and production. Therefore, it is helpful to understand the different rankings and reflect on them. For example, the databases and rankings by Clarivate have a strong bias toward specific disciplines (e.g., physics) and tend to downplay the impact and volume of BISE research (Ioannidis et al. 2019 ).

2 Ranking Institutions

First, we consider the rankings at the institutional level, i.e., mostly universities. These rankings often also provide a ranking per subject. ShanghaiRanking Consultancy annually publishes the Academic Ranking of World Universities (ARWU) and the Global Ranking of Academic Subjects (GRAS) (www.shanghairanking.com). The ARWU ranking is based on the number of alumni and staff winning Nobel prizes and Fields medals, the number of highly cited researchers selected by Clarivate, the number of articles published in the journals Nature and Science, and the number of articles indexed in the Science Citation Index Expanded and the Social Sciences Citation Index (Web of Science). Note that ARWU heavily relies on Clarivate data, as well as on specific awards (e.g., Nobel prizes) and journals (e.g., Nature). This means that areas such as Computer Science (where conferences are important and there are "only" Turing award winners instead of Nobel prize winners) are undervalued. The GRAS ranking uses 54 subjects, including Computer Science and Engineering, Economics, Business Administration, and Management.

Times Higher Education (THE) annually publishes THE World University Ranking and THE World University Ranking by Subject ( www.timeshighereducation.com ). These rankings use Elsevier’s Scopus database. Citations account for 30% of the score. Other elements include student-to-staff ratios, reputation, research income, and proportion of international students. There are 11 subject rankings. Most relevant for BISE are Business and Economics, and Computer Science.

Quacquarelli Symonds (QS) publishes the QS World University Ranking and the QS World University Ranking by Subject ( www.topuniversities.com ). Like THE, QS also uses Elsevier’s Scopus database. Citations account for only 20% of the score. Academic reputation accounts for 40%. Other criteria are international student ratio, international faculty ratio, faculty-to-student ratio, and employer reputation. The QS World University Ranking by Subject covers a total of 54 disciplines, grouped into five broad subject areas. Most relevant for BISE are Computer Science and Information Systems, Data Science, Business and Management, and Economics and Econometrics.

As Fig.  1 shows, there are many other university rankings. For example, US News and World Report produces the Best Global University Ranking and the Best Global Universities Subject Ranking ( www.usnews.com/rankings ). The Centre for Science and Technology Studies (CWTS) in Leiden publishes the CWTS Leiden Ranking and CWTS Leiden Ranking by Field ( www.leidenranking.com ). SCImago Lab publishes the SCImago Institutions Ranking ( www.scimagoir.com ), and Research.com publishes the Best University Ranking (research.com). Note that the latter ranking is only provided per subject category and is solely based on researchers with a high Hirsch index.

All of these rankings use different methodologies. Some focus more on scientific output, others more on reputation. Some are more forward-looking, and others are more backward-looking. Therefore, there are differences, but these tend to be smaller than expected (especially for the top 100). Indicators often seem to be selected due to their availability. Also, some measures are size-dependent, making it impossible for smaller or specialized universities to achieve a high overall ranking.

When it comes to research output, the sum of the research outputs of the institution’s researchers matters. When it comes to reputation, both current staff and earlier students and staff matter. This shows that hiring and retaining the best researchers is vital for universities. Due to the Matthew effect, this leads to a further concentration of talent.

3 Ranking Researchers

Next, we consider the rankings at the individual level. These rankings are often seen as controversial (Van der Aalst 2022). Whereas university rankings generate revenue through advertisements (and are therefore managed in a professional manner), individual researcher rankings tend to be informal or a side-product of some other service. There are many rankings in specific subfields, e.g., in economics. There are only a few that cover all disciplines. Research.com publishes the Best Scientists Ranking by Field. This ranking is based on a scholar's D-index (Discipline H-index), which takes into account only publications and citation metrics for an examined discipline. The fields Business and Management, as well as Computer Science, are most relevant for BISE. The Alper-Doger (AD) Scientific Index publishes the World Scientists Rankings by Subject (www.adscientificindex.com), which is based on the total and last five years' values of the i10 index, H-index, and citation scores in Google Scholar. Clarivate maintains a list of Highly Cited Researchers based on the Web of Science (clarivate.com/highly-cited-researchers). Finally, Elsevier Scopus provides several author metrics that can be used to create rankings easily.

The easiest way to evaluate productivity and impact is to simply count the number of published papers and the number of citations. Clearly, this is very naïve because it is possible to publish many papers that are incremental or of low quality. Counting the total number of citations is also problematic because a researcher may be an "accidental co-author" of a highly cited paper. This does not say much about the contribution of the author, and citations tend to follow a power-law distribution (i.e., just a few papers attract most of the citations). To address the limitations of simply counting papers and citations, the scientific community has created journal and conference rankings, and metrics like the well-known Hirsch index. This H-index was first proposed by Jorge E. Hirsch in 2005 and has been adapted in many different ways (Harzing and Alakangas 2016).
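For concreteness, the H-index itself is only a few lines of code. The Python sketch below uses made-up citation counts with a power-law flavor: one "hit" paper attracts most of the citations, yet the H-index stays modest.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical, roughly power-law citation counts for ten papers:
# one paper dominates, yet the H-index is only 5.
papers = [250, 90, 40, 12, 8, 5, 3, 1, 0, 0]
print(h_index(papers))  # 5
```

Note that adding more citations to the top paper would not change the result at all, which illustrates why the H-index is robust against a single "accidental" hit but also blind to it.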

The DORA declaration mentioned before advocates not using such measures (DORA 2012). In the Netherlands, the "Recognition and Rewards" ("Erkennen en Waarderen") program (NWO 2019) was initiated to improve the evaluation of academics and to give credit to people working in teams or focusing on teaching. Similar initiatives can be seen in other countries and at the European level (COARA 2022). Although the goals of such programs are reasonable, and it is impossible to disagree with statements such as "quality is more important than quantity" and "one should recognize and value team performance and interdisciplinary research", suitable measures are lacking. Such initiatives are often used to dismiss any attempt to quantify and evaluate productivity and impact. In some universities, it has even become "politically incorrect" to talk about published papers and the number of citations. In Torres-Salinas et al. (2023), this phenomenon is described as "bibliometric denialism" and an incorrect interpretation of the DORA declaration, which primarily focused on abuse and misuse of the Journal Impact Factor (JIF). When evaluating and selecting academics, committee members typically still secretly look at the data provided by Google Scholar, Scopus, and Web of Science. This is because it is challenging to evaluate and compare academic performance in an objective and qualitative way. In fact, not using quantitative data creates the risk that evaluations and selections become highly subjective, e.g., based on taste, personal preferences, and criteria not known to the individuals evaluated. Moreover, in such processes, quantitative data are often still used, but in an implicit, secretive, and inconsistent manner.

Therefore, despite all the problems, we often still need to resort to data-driven approaches to evaluate productivity and impact. Of course, quantitative measures should only support expert assessment and are not a substitute for informed judgment. When using citation scores, one should definitely consider the “Leiden Manifesto for research metrics” (Hicks et al. 2015 ), which provides ten principles to guide research evaluations.

As elaborated in Sect. 4, it is also not easy to rank outlets (journals, conferences, workshops, etc.). Therefore, in this section, we confine ourselves to counting output and impact in terms of citations. There are multiple databases that can be used to evaluate productivity and impact, e.g., Elsevier's Scopus and Google Scholar (both released in 2004) and Web of Science (online since 2002). Also, dedicated tools running on top of these platforms, such as InCites (using the Web of Science) and SciVal (using Scopus), have been developed. Web of Science has a strong focus on journals published in the US and favors traditional disciplines such as physics. Conferences are only partially covered. For a BISE researcher, the number of citations in Google Scholar may be twice the number of citations in Scopus, and over eight times the number of citations in Web of Science. For a researcher in physics, the differences between Google Scholar, Scopus, and Web of Science tend to be much smaller. This means that Web of Science should not be used for underrepresented disciplines like BISE. Google Scholar has the most extensive coverage, but also data quality problems. Google Scholar simply crawls academic-related websites and also counts non-peer-reviewed documents. One may also find stray citations, where minor variations in referencing lead to duplicate records for the same paper (Harzing and Alakangas 2016). Moreover, the output of different authors may be merged into one user profile. Scopus and Web of Science also have such problems, but to a lesser degree. These examples illustrate that the impact of data quality problems and limited coverage is not equally distributed. Considering data quality and coverage, Scopus can be seen as the "middle road" when counting publications and citations (Baas et al. 2020; Harzing and Alakangas 2016; Van der Aalst 2022).

Another complication is that there are different publication traditions that significantly impact the most common measures used today. In many disciplines, the average number of authors is around two. However, in areas like physics, the average is above ten authors, and there are papers with hundreds or even thousands of authors. An article on measuring the Higgs Boson Mass published in Physical Review Letters has 5,154 authors (Aad et al. 2015 ). This 33-page article has 24 pages to list the authors, and only nine pages are devoted to the actual paper. When counting H-indices in the standard way, this paper will increase the H-index by one for more than 5,000 authors. Also, the order in which authors are listed varies from discipline to discipline. In mathematics, it is common to list authors alphabetically. In other fields, the order is based on contribution. Also, the “last author” position may have a specific meaning (e.g., the project leader or most senior researcher). In Computer Science, conference publications are regarded as important and comparable to journal publications. In other areas, conference publications “do not count”, and all work is published in journals. The above shows that counting just journal papers while ignoring the number of authors may have hugely diverging consequences for different disciplines.

An interesting approach to address some of these concerns was proposed by John Ioannidis and his colleagues (Ioannidis 2022; Ioannidis et al. 2016, 2019, 2020). They propose to use a composite indicator (called C-score), which is the sum of six standardized, log-transformed citation indicators (NC, H, Hm, NCS, NCSF, NCSFL):

  • the total number of citations received (NC),
  • the Hirsch index for the citations received (H),
  • the Schreiber co-authorship adjusted Hm index for the citations received (Hm),
  • the total number of citations received to papers for which the scientist is single author (NCS),
  • the total number of citations received to papers for which the scientist is single or first author (NCSF), and
  • the total number of citations received to papers for which the scientist is single, first, or last author (NCSFL).

For a detailed explanation of these indicators, we refer to Ioannidis et al. (2016) and Ioannidis et al. (2019). The resulting C-score focuses on impact (citations) rather than productivity (number of publications) and incorporates information on co-authorship and author positions (single, first, last author). Each of the NC, H, Hm, NCS, NCSF, and NCSFL scores is normalized to a value between 0 and 1, and these are summed up. Hence, the C-score has a range between 0 and 6. In the dataset (Ioannidis 2022), data for 194,983 scientists are reported. The selection is based on the top 100,000 scientists by C-score (with and without self-citations) or a percentile rank of 2% or above in the subfield. The researchers are classified into 22 scientific fields and 174 sub-fields. The dataset is based on all Scopus author profiles as of September 1, 2022, because Scopus can be seen as the middle ground between Google Scholar and Web of Science.

Currently, the C-score seems to be the best way to measure the impact of an author based on her publications. Although the C-score definitely has its limitations and only paints a one-dimensional picture, it removes some of the biases and creates a level playing field when quantifying scientific impact.
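As a rough illustration of how such a composite indicator can be built, the Python sketch below log-transforms each of the six indicators, scales it to [0, 1] by its maximum over the population, and sums the six normalized values. The indicator values are invented, and the normalization is a simplification of the standardization actually used by Ioannidis et al.

```python
import math

INDICATORS = ("nc", "h", "hm", "ncs", "ncsf", "ncsfl")

def composite_scores(scientists):
    """Sum of six log-transformed indicators, each scaled to [0, 1]
    by its maximum over the population (a simplified C-score)."""
    logged = {k: [math.log1p(s[k]) for s in scientists] for k in INDICATORS}
    maxima = {k: (max(v) or 1.0) for k, v in logged.items()}
    return [
        sum(logged[k][i] / maxima[k] for k in INDICATORS)
        for i in range(len(scientists))
    ]

population = [  # invented indicator values for two scientists
    {"nc": 12000, "h": 45, "hm": 30, "ncs": 800, "ncsf": 2500, "ncsfl": 6000},
    {"nc": 3000, "h": 25, "hm": 20, "ncs": 400, "ncsf": 1200, "ncsfl": 2000},
]
scores = composite_scores(population)  # each value lies in [0, 6]
```

Because the six components are normalized against the population before summing, a scientist who leads on every indicator reaches the maximum score of 6, while everyone else lands strictly below it.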

4 Ranking Outlets (Journals, Conferences, Etc.)

Researchers produce artifacts such as papers, datasets, prototypes, and software. For software and datasets, one can measure the number of downloads. This can also be done for papers. Downloads and citations are definitely indicators of impact. However, the impact of an artifact can only be measured after some time. This delay complicates decision-making. When a paper is published, it is unclear what impact it will have in five or ten years. Similarly, it is hard to judge the future impact of a PhD thesis for people not directly involved. The PhD student may have left academia before there is “bibliometric evidence” that the thesis realized major breakthroughs. Due to this delay, it is tempting to assign value to the “outlet” of a paper (e.g., journal, conference, or workshop). A paper published in Science or Nature is expected to have more impact than a paper published in some informal workshop proceedings. A paper accepted for a conference with an acceptance rate of 10% is expected to have more impact and higher quality than a paper accepted for a conference with an acceptance rate of 90%. Therefore, there is a desire to “rank outlets”. This has the advantage that one can assign “value” to a paper the moment it is accepted and remove the delay mentioned before. This results in ranked lists of journals and conferences.

However, focused lists of journals and conferences tend to have a topical or geographical bias. For example, in the field of Information Systems (IS), the "College of Senior Scholars" selected a "basket" of journals as the top journals in their field. The goal was to address the problem that few "Information Systems" (IS) journals were widely considered elite-level journals in tenure and promotion cases. However, looking at the selected journals, the field of IS was interpreted in a particular manner. In Europe, IS also includes more technical subjects (e.g., building prototype systems, developing algorithms, and using formal reasoning). This side of IS is not well-represented in the current basket. Some universities create their own local journal lists for specific areas and use these for tenure decisions. This heavily influences the research conducted by young researchers. The CORE ranking of conferences (CORE 2023) is much broader, but has similar problems (e.g., the ranking was established by a few computer science departments in Australia and New Zealand and is now used all over the globe to decide on research funding and travel budgets). The intentions behind these lists are good. However, it is unavoidable that there are topical biases and scoping issues. Moreover, such rankings are like a self-fulfilling prophecy. This again leads to a variant of the Matthew effect, i.e., the higher the ranking of a conference or journal, the more people want to submit to it, which automatically improves its status. This, combined with a narrow focus, leads to a degenerate view of research quality and discourages innovation in new directions. Although research is changing rapidly, these journal lists tend to be relatively stable. Also, the editorial boards of these journals aim for particular types of papers. Excellent, highly innovative papers may be rejected due to scope issues and end up in lower-ranked journals. As a result, young researchers are encouraged to write "what is expected" rather than to explore new research directions.

To avoid subjectivity in ranking journals and conferences, one can use quantitative measures based on citations. Instead of evaluating a researcher, one now evaluates the work published by a journal or conference in a given time period. Figure  1 shows some of the journal and conference rankings.

Well-known metrics based on Elsevier's Scopus are CiteScore, SNIP (Source Normalized Impact per Paper), and SJR (SCImago Journal Rank) (Roldan-Valadez et al. 2019). Well-known metrics based on Clarivate's Web of Science are the JIF (Journal Impact Factor) and the 5yIF (Five-year Impact Factor). Google Scholar is used to compute the H5 index. To understand how such metrics are computed, let us consider the way CiteScore, JIF, and H5 are computed for BISE for 2023. The CiteScore for 2023 is the number of citations in Scopus to BISE papers published in 2020, 2021, 2022, and 2023 (four years) divided by the number of papers published by BISE in the same period. The JIF for 2023 is the number of citations to BISE papers published in 2021 and 2022 by Web of Science papers in 2023, divided by the number of BISE papers published in 2021 and 2022. The H5 score is the Hirsch index for articles published in the last five years. For 2023, the H5-index for BISE is the largest number X such that X articles published in BISE in 2018–2022 have received at least X citations each (using Google Scholar). As can be noted, the intent of these measures is similar: measuring impact based on citations. However, the underlying data sources and time scales are different.
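Given per-paper citation counts for the relevant publication window, both a CiteScore-style ratio and an H5-style index reduce to a few lines of Python; the JIF follows the same ratio pattern with a two-year window. The citation counts below are invented for illustration.

```python
def cite_score(citations_per_paper):
    """CiteScore-style ratio: citations to papers published in the
    window, divided by the number of papers in that window."""
    return sum(citations_per_paper) / len(citations_per_paper)

def h5_index(citations_last_5y):
    """Largest X such that X papers in the window have >= X citations."""
    ranked = sorted(citations_last_5y, reverse=True)
    return max((i for i, c in enumerate(ranked, 1) if c >= i), default=0)

# Hypothetical citation counts for ten papers in a publication window.
window = [12, 9, 7, 5, 4, 3, 2, 1, 0, 0]
print(cite_score(window))  # 4.3
print(h5_index(window))    # 4
```

The contrast is visible even in this toy example: the ratio rewards every citation equally, whereas the H5-style index ignores the long tail of weakly cited papers and the excess citations of the top papers.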

The San Francisco Declaration on Research Assessment (DORA 2012) movement was triggered by the scientific community’s obsession with the JIF. Even for journals with astronomical impact factors, the citations of individual papers vary widely. As shown in (Schmid 2018), the average number of citations of the top 10% and bottom 10% of papers published in Nature differs twenty-fold. Hence, it is odd to judge a paper based on the JIF of the journal that happened to publish it; looking at the outlet alone is not enough to evaluate the quality, novelty, and impact of the work. This was the main trigger for the DORA movement. Unfortunately, it also resulted in widespread “bibliometric denialism” (Torres-Salinas et al. 2023). Peer review and qualitative judgment are difficult to implement and tend to be subjective. Therefore, completely rejecting quantitative indicators based on bibliometric data seems counterproductive.

5 Implications

As expected, we were not able to answer the question “How to evaluate institutions, researchers, journals, and conferences?” in a satisfactory manner. However, by posing the question and providing an overview of the different types of rankings, we hope to trigger a discussion about what these rankings mean for the BISE community. Although these rankings have many limitations and measure what can be measured rather than what should be measured, they remain highly relevant for BISE researchers. We often use the phrase “you get what you measure” to indicate that rankings influence the behavior of students and researchers. It may also explain why particular types of research are conducted in particular countries. In countries with a focus on publishing in a few top journals that enforce specific research methods, certain types of research cannot flourish. For example, in Computer Science and in Europe, there is a stronger focus on conference publications; in Management Science and in the US, there is a stronger focus on journal publications. Academics working on “Information Systems” (IS) in the US tend to work on rather different things than academics working on IS in Europe: US-based IS researchers tend to have a more social-sciences focus, and European IS researchers tend to work on more technical and conceptual topics. This may explain why Business Process Management (BPM) research thrives in Europe and parts of the Asia-Pacific region (e.g., Australia), but is almost non-existent in the US. Of course, this is not just due to rankings; cultural aspects also play a significant role. However, for BISE researchers, it is important to reflect on all these phenomena.

Aad G et al (2015) Combined measurement of the Higgs boson mass in pp collisions at √s = 7 and 8 TeV with the ATLAS and CMS experiments. Phys Rev Lett 114(19):191803. https://doi.org/10.1103/PhysRevLett.114.191803

Baas J, Schotten M, Plume A, Côté G, Karimi R (2020) Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies. Quant Sci Stud 1(1):377–386. https://doi.org/10.1162/qss_a_00019

COARA (2022) Agreement on reforming research assessment. https://coara.eu/. Accessed 29 August 2023

CORE (2023) CORE rankings portal. https://www.core.edu.au/. Accessed 29 August 2023

DORA (2012) San Francisco declaration on research assessment (DORA). https://sfdora.org/. Accessed 29 August 2023

Harzing AW, Alakangas S (2016) Google Scholar, Scopus and the Web of Science: a longitudinal and cross-disciplinary comparison. Scientometrics 106:787–804. https://doi.org/10.1007/s11192-015-1798-9

Hazelkorn E (2018) Reshaping the world order of higher education: the role and impact of rankings on national and global systems. Policy Rev High Educ 2(1):4–31. https://doi.org/10.1080/23322969.2018.1424562

Hicks D, Wouters P, Waltman L, de Rijcke S, Rafols I (2015) Bibliometrics: the Leiden Manifesto for research metrics. Nature 520:429–431. https://doi.org/10.1038/520429a

Ioannidis J (2022) September 2022 data-update for “Updated science-wide author databases of standardized citation indicators”. Elsevier Data Repos. https://doi.org/10.17632/btchxktzyw.5

Ioannidis J, Klavans R, Boyack K (2016) Correction: multiple citation indicators and their composite across scientific disciplines. PLoS Biol 14(8):e1002548. https://doi.org/10.1371/journal.pbio.1002548

Ioannidis J, Baas J, Klavans R, Boyack K (2019) A standardized citation metrics author database annotated for scientific field. PLoS Biol 17(8):e3000384. https://doi.org/10.1371/journal.pbio.3000384

Ioannidis J, Boyack K, Baas J (2020) Updated science-wide author databases of standardized citation indicators. PLoS Biol 18(10):e3000918. https://doi.org/10.1371/journal.pbio.3000918

Moed HF, Burger WJM, Frankfort GJ, Van Raan A (1985) The use of bibliometric data for the measurement of university research performance. Res Policy 14(3):131–149. https://doi.org/10.1016/0048-7333(85)90012-5

NWO (2019) Recognition and Rewards (“Erkennen en Waarderen”) program, an initiative by VSNU, NFU, KNAW, NWO and ZonMw. https://recognitionrewards.nl/. Accessed 29 August 2023

Ringel L, Espeland W, Sauder M, Werron T (2021) Worlds of rankings. In: Ringel L et al (eds) Worlds of rankings (Research in the sociology of organizations). Emerald, Bingley

Roldan-Valadez E, Salazar-Ruiz SY, Ibarra-Contreras R, Rios C (2019) Current concepts on bibliometrics: a brief review about impact factor, Eigenfactor score, CiteScore, SCImago Journal Rank, Source-Normalised Impact per Paper, H-index, and alternative metrics. Ir J Med Sci 188:939–951. https://doi.org/10.1007/s11845-018-1936-5

Schmid SL (2018) Five years post-DORA: promoting best practices for research assessment. Mol Biol Cell 28(22):2941–2944. https://doi.org/10.1091/mbc.E17-08-0534

Stolz I, Hendel DD, Horn AS (2010) Ranking of rankings: benchmarking twenty-five higher education ranking systems in Europe. High Educ 60:507–528. https://doi.org/10.1007/s10734-010-9312-z

Torres-Salinas D, Arroyo-Machado W, Robinson-Garcia N (2023) Bibliometric denialism. Scientometrics 128:5357–5359. https://doi.org/10.1007/s11192-023-04787-2

Van der Aalst WMP (2022) Yet another view on citation scores. LinkedIn Pulse. https://www.linkedin.com/pulse/yet-another-view-citation-scores-wil-van-der-aalst. Accessed 29 August 2023


Open Access funding enabled and organized by Projekt DEAL.

Author information

Authors and Affiliations

Lehrstuhl für Informatik 9, RWTH Aachen, Ahornstr. 55, 52056, Aachen, Germany

Wil M. P. van der Aalst

Faculty of Economics and Business Administration, Goethe University Frankfurt, Theodor-W.-Adorno-Platz 4, 60323, Frankfurt am Main, Germany

Oliver Hinz

Institute of Information Systems and Marketing (IISM), Karlsruhe Institute of Technology (KIT), Kaiserstr. 89-93, 76133, Karlsruhe, Germany

Christof Weinhardt


Corresponding author

Correspondence to Wil M. P. van der Aalst .

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

van der Aalst, W.M.P., Hinz, O. & Weinhardt, C. Ranking the Ranker: How to Evaluate Institutions, Researchers, Journals, and Conferences? Bus Inf Syst Eng 65, 615–621 (2023). https://doi.org/10.1007/s12599-023-00836-5


Published: 16 October 2023

Issue Date: December 2023

DOI: https://doi.org/10.1007/s12599-023-00836-5



50 Journals used in FT Research Rank


By Laurent Ormans


The Financial Times conducted a review in May 2016 of the journals that count towards its research rank. As a result, the number of journals considered rose from 45 to 50. The 200-odd business schools that take part in the FT Global MBA, Executive MBA or Online MBA rankings were invited to submit up to five new journals to include in, and five journals to exclude from, the previous list. A total of 140 schools submitted their votes, a response rate of 67 per cent. Out of the 10 selected journals up for review, we decided to drop the four journals that each received 60 per cent or more of the votes: Academy of Management Perspectives, California Management Review, Journal of the American Statistical Association and RAND Journal of Economics. Out of the 150 new journals suggested, the nine journals (*) with the most votes were added to the list.

The list below details the 50 journals used by the Financial Times in compiling the FT Research rank, included in the Global MBA, EMBA and Online MBA rankings.

1. Academy of Management Journal
2. Academy of Management Review
3. Accounting, Organizations and Society
4. Administrative Science Quarterly
5. American Economic Review
6. Contemporary Accounting Research
7. Econometrica
8. Entrepreneurship Theory and Practice
9. Harvard Business Review
10. Human Relations*
11. Human Resource Management
12. Information Systems Research
13. Journal of Accounting and Economics
14. Journal of Accounting Research
15. Journal of Applied Psychology
16. Journal of Business Ethics
17. Journal of Business Venturing
18. Journal of Consumer Psychology
19. Journal of Consumer Research
20. Journal of Finance
21. Journal of Financial and Quantitative Analysis
22. Journal of Financial Economics
23. Journal of International Business Studies
24. Journal of Management*
25. Journal of Management Information Systems*
26. Journal of Management Studies
27. Journal of Marketing
28. Journal of Marketing Research
29. Journal of Operations Management
30. Journal of Political Economy
31. Journal of the Academy of Marketing Science*
32. Management Science
33. Manufacturing and Service Operations Management*
34. Marketing Science
35. MIS Quarterly
36. Operations Research
37. Organization Science
38. Organization Studies
39. Organizational Behavior and Human Decision Processes
40. Production and Operations Management
41. Quarterly Journal of Economics
42. Research Policy*
43. Review of Accounting Studies
44. Review of Economic Studies*
45. Review of Finance*
46. Review of Financial Studies
47. Sloan Management Review
48. Strategic Entrepreneurship Journal*
49. Strategic Management Journal
50. The Accounting Review


Ranking Journals: Academic Journal Guide 2021 (“ABS List”)

The influential Chartered Association of Business Schools has just published its Academic Journal Guide 2021 (“ABS list”). What does this mean for the operations and supply chain management (OSCM) research community? I have looked at the ranks of 15 major OSCM journals.

Once again, only the following OSCM journals were classified in category 4*: Journal of Operations Management, Management Science and Operations Research. To be honest, I wonder if the asterisk is really still appropriate for Operations Research.

The following OSCM journals were given a 4 in the 2021 ABS list: European Journal of Operational Research, International Journal of Operations & Production Management, Journal of Supply Chain Management and Production & Operations Management. This is good news for our discipline, because it means that the Journal of Supply Chain Management has moved up into this important category. However, it would have been time to give this journal not just a 4, but a 4*.

The following OSCM journals were given a grade of 3: Decision Sciences, International Journal of Production Economics, Journal of Business Logistics, Journal of Purchasing & Supply Management, Manufacturing & Service Operations Management and Supply Chain Management: An International Journal. Here, there are even two new entries: Journal of Business Logistics and Journal of Purchasing & Supply Management. While this is good news for both logistics and procurement scholars, I would have expected Journal of Business Logistics and Manufacturing & Service Operations Management to rank even higher.

The International Journal of Physical Distribution & Logistics Management lands at a 2 and the International Journal of Logistics Management lands at a 1 again. The low scores for these two journals in the 2021 ABS list are particularly strange given their quality.

My conclusion: The team behind the 2021 Academic Journal Guide appears to have listened – at least partly – to the harsh criticism from OSCM scholars. Although our discipline is certainly still underrated, compared to many other disciplines, there are finally some bright spots that give OSCM researchers a little more air to breathe. Empirically-focused OSCM journals were particularly disadvantaged by the ABS list in the past and three of them have now been upgraded. This step was overdue.

This slightly positive development for our discipline should not hide the harmful effects of rankings in general. Academia is increasingly about metrics rather than content. Assessment committees, bonus decisions and tenure-track regulations increasingly revolve around counting the names of certain journals per year instead of reading them. In the UK, the home country of the ABS list, the REF system has turned academic debate (i.e., quality) into a “race for points” (i.e., quantity). This is a very negative development.

Therefore, it is gratifying that more and more universities are signing the San Francisco Declaration on Research Assessment (DORA), which asks “not [to] use journal-based metrics […] as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion, or funding decisions”.

Tags: Ranking


About Andreas Wieland

5 Responses to “Ranking Journals: Academic Journal Guide 2021 (‘ABS List’)”


Rightly said, few bright spots and some interesting misses too…. IJPDLM is certainly one of them. I am also of the view that the grading and ranking should be based on the intellectual merit rather than the publications in ranked journals


Your summary of the rankings is right on, Andreas. Thank you. It is good to see some upward movement in the rankings of our best journals. IJPDLM and IJLM are puzzling; each should have moved up at least one position. JBL and JPSM should both have moved up more, but I think this committee does not want to move any journal more than one position.


I have no idea why EJOR is ranked a “4”. Should be closer to a “2”. JBL is also top tier. Good points with IJPDLM and IJLM. Fully agree we have a flawed system for evaluating research impact. It is only getting worse.


A quick look at the editorial board of M&SOM would convince anyone that it is a top journal – editors come from all top US schools. Sad to see that universities in many European countries still do not consider it a top outlet. Also surprised to see EJOR at the top. My two cents, of course…

The editorial board of M&SOM comprises top researchers from top US Schools. How it can be considered to be not a top journal is beyond me…


AIS Research Rankings


This Software Service tracks publications in 8 leading IS journals in the AIS Senior Scholars’ basket of journals. At present, the database includes data for the years 1990–2023. The Software Service is provided free of charge for 2024. Please read the end-user licensing agreement for details governing its use.

Rankings Feature Highlights

  • University rankings by different regions (Americas, Europe/Africa, Asia/Australia)
  • University rankings using 2 different rankings methods
  • University rankings using various combinations of journals
  • University rankings using various time windows
  • Author rankings using 4 different rankings methods
  • Top-200 rankings
  • Secure HTTPS (SSL) website
  • Data from 1990 to 2023

The current service is based on the AIS Senior Scholars’ basket of eight journals. We are aware that 3 journals have been added to the basket. We expect next year’s data to include these newly added journals, with data from 2024 moving forward.


  • NATURE INDEX
  • 15 June 2023
  • Correction 22 June 2023

Nature Index Annual Tables 2023: first health-science ranking reveals big US lead

  • Bianca Nogrady



Biologist Benjamin Jin works on immunotherapy for human papillomavirus-positive cancers at the US National Cancer Institute in Bethesda, Maryland. The United States is the most prolific country in health-sciences research output in the Nature Index. Credit: Saul Loeb/AFP via Getty

The United States dominates global health-sciences publishing in the Nature Index Annual Tables 2023, the first to track output in high-quality medical journals. Major government and industry investment has cemented the country’s status as the world leader in health-sciences output. Its closest competitor, China, overtook the United States in natural-sciences output in 2022.

The Annual Tables rank nations, territories and institutions according to their Share, a metric that tracks the proportion of authors from an institution or region on each paper published in a year in the journals tracked in the Nature Index. The inclusion of 64 medical journals in this year’s tables adds 9,200 articles to the database for 2022 and allows publication output to be tracked across the health sciences as well as the four existing natural-science categories (physical sciences, chemistry, Earth and environmental sciences, and biological sciences, formerly referred to as life sciences).

In this first ranking of nations by health-sciences output, the US Share was 5,352, well above that of China, at 1,287, and the United Kingdom, at 963 (see ‘Leagues apart’). In the natural sciences, China leads with a Share of 19,373 and the United States is second, with 17,610.
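The Share metric described above is a fractional count: each paper credits an institution (or country) with the fraction of its authors affiliated there, summed over all tracked papers. A minimal sketch under that reading follows; the author counts are hypothetical and this is not Nature Index’s actual pipeline:

```python
def share(papers: list[tuple[int, int]]) -> float:
    """Sum over papers of (institution's authors / total authors).

    Each tuple is (authors_from_institution, total_authors) for one
    paper published in the tracked journals in a given year.
    """
    return sum(inst / total for inst, total in papers)

# Hypothetical institution with three tracked papers:
# 2 of 4 authors, 1 of 2 authors, and 3 of 3 authors.
print(share([(2, 4), (1, 2), (3, 3)]))  # -> 2.0 (0.5 + 0.5 + 1.0)
```

Fractional counting of this kind rewards substantial authorship contributions rather than raw paper counts, which is why a single multi-institution paper adds less than 1 to any one institution’s Share.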

[Chart ‘Leagues apart’. Source: Nature Index]

Health-science research is a major focus of federal spending in the United States, says Carol Robbins, a senior resources analyst at the US National Science Foundation’s National Center for Science and Engineering Statistics, in Alexandria, Virginia. “Federal funding for health-related research and development is almost as high as funding for defence-related research and development,” she says. The US National Institutes of Health (NIH) alone invested around US$42 billion in health and medical research in 2022, and is hoping to increase its budget to $51.1 billion in 2024. The pharmaceutical sector in the United States also spends big: in 2019, it invested an estimated $83 billion in research and development.

Health-research spending is seen as a vote-winner, says Jonathan Adams, chief scientist at the Institute for Scientific Information in London, the research arm of analytics firm Clarivate, which in April published a report on US research trends over the past 15 years. This found that national research funding prioritized the NIH over other government departments and agencies. Almost half of the US civilian research and development budget now goes into the NIH’s coffers, the report notes.

“There is a lot of money going into health — and the research that underpins” it, says Adams. He adds that health-research funding is a popular talking point for politicians on the campaign trail. Among the leading countries in health-sciences research, the United States outspends its closest competitors by a large margin (see ‘Healthy investment’).

[Chart ‘Healthy investment’. Source: WHO]

The United States has a long history of investment and success in the medical sciences. Nearly one-quarter of its 406 Nobel prizes have been awarded for work in physiology or medicine. “If you’re good at something, and you have people who are winning Nobel prizes, and facilities that are recognized globally, then they get backed further because you can put forward some pretty coherent arguments about why you should get further investment,” Adams says.

But such strength in the health sciences doesn’t mean that the United States will always dominate. Adams says that China and India are likely to challenge the US lead in future. “That side of the Chinese research economy is going to expand,” he says.

Small nation strength

In terms of population size, the Netherlands is the smallest country to make it into the top ten in the health-science category. With roughly 18 million people, it has an outsized impact, ranking eighth in the Nature Index Annual Tables 2023, with a Share of 358 — above Japan and Italy.

Dutch institutional spending on medical research and development has leapt forward in recent years, from €67 million (US$72 million today) in 2019 to €235 million in 2020. This is a relatively large proportion of the country’s overall research budget, compared with other nations. The Netherlands ranks fourth globally in the number of patent applications for medical technology, sixth for biotechnology patents and eighth for pharmaceutical patents.

Prolific institution

At an institutional level, the United States dominates, taking eight of the leading ten positions in the 2023 Annual Tables for health sciences (see ‘Show of strength’). The University of Toronto in Canada is one exception, in third position after Harvard University in Cambridge, Massachusetts, and the NIH.

[Chart ‘Show of strength’]

Leah Cowen, vice-president of research, innovation and strategic initiatives, says that several factors contribute to the University of Toronto’s success. One is a strong emphasis on collaboration; for example, 14 research hospitals belong to the Toronto Academic Health Science Network, which brings researchers and clinicians together to develop and test treatments. This year, University of Toronto researchers partnered with clinician scientists at the Hospital for Sick Children in Toronto to explore the use of magnetically guided robotic nanoscalpels to target the cancer cells in a type of brain tumour called glioblastoma. Another field that has benefited from collaboration is regenerative medicine: the University of Toronto has long established its legacy through pioneering work on stem cells, Cowen says.

The university has also fostered initiatives focused on ‘grand questions’ that bring together researchers from diverse disciplines to tackle subjects such as heart failure, personalized medicine, ageing and the role of cell organelles called mitochondria in human health.


“We’re really committed to engaging on the full spectrum of research, all the way from state-of-the-art pioneering fundamental research through to clinical research, knowledge translation, clinical trials, drug discovery and bio-innovation,” Cowen says. And that’s not limited to faculty members; she says the university encourages students to explore commercial opportunities and entrepreneurship, as well.

Cowen says Toronto has won substantial government grants, such as Can$200 million (US$147 million) from the Canada First Research Excellence Fund for the university’s Acceleration Consortium, which is exploring the use of artificial intelligence and robotics in drug discovery and new materials design. However, the outlook for funding in health-science research is not as promising in Canada as it is in the United States. In real terms, Cowen says, overall federal funding for health-science research has not kept pace with inflation.

There is plenty of advocacy for improving Canadian funding for health-science research, says Cowen. “Despite limited investment, we’re really punching above our weight,” she says. “We’re achieving exceptional impact, so further investment would yield extraordinary additional reward.”

doi: https://doi.org/10.1038/d41586-023-01867-4

Updates & Corrections

Correction 22 June 2023 : An earlier version of this story erroneously stated that there are 15 research hospitals in the Toronto Academic Health Science Network. In fact, there are 14.




Ranking: Which Business Schools Are Best For Management Research?


When it comes to the sheer volume of publications in top management journals, no other business school outpaced University of Pennsylvania’s The Wharton School in 2023.

Wharton faculty published 21 articles in management’s eight most prestigious journals, according to the TAMUGA Rankings, an annual index of research output compiled by Texas A&M University and the University of Georgia.

It shouldn’t be a big surprise. After Wharton’s fall from The Financial Times’ Global MBA ranking in 2023, it was also bumped from the FT’s research ranking. But the B-school, widely regarded as one of the best in the world, returned to the top of both ranks this year.

It’s also the largest business school in terms of faculty in the TAMUGA Ranking, tying with Harvard Business School at 46 tenure-track management faculty. That’s a leg up in a ranking that counts articles published.

One big caveat: If you look at the number of top-tier publications per faculty member, Wharton falls all the way to No. 19 with an average of 0.46 articles per management scholar.

HOW TAMUGA COUNTS ARTICLES

For its annual list, the TAMUGA Rankings count only publications in management’s eight top-tier journals: Academy of Management Journal, Academy of Management Review, Administrative Science Quarterly, Journal of Applied Psychology, Strategic Management Journal, Organization Science, Personnel Psychology, and Organizational Behavior and Human Decision Processes.

Counted articles must be published by management faculty within the calendar year – in this case January to December 2023. Articles with co-authors from multiple universities are counted once for each school, but articles with co-authors from the same institution are just counted once. So, an article co-authored by two University of Florida professors, one Texas A&M University professor, and one University of Georgia professor would count as “one” for Florida, “one” for Texas A&M, and “one” for Georgia, according to the ranking’s methodology page.
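
The counting rule is easy to express in code. Here is a minimal sketch of that logic (the data shape and school names are illustrative, not taken from the ranking’s actual tooling):

```python
def count_publications(articles):
    """TAMUGA-style counting: each article counts once per distinct
    school among its co-authors, no matter how many co-authors that
    school contributed."""
    counts = {}
    for author_schools in articles:          # one list of affiliations per article
        for school in set(author_schools):   # dedupe same-school co-authors
            counts[school] = counts.get(school, 0) + 1
    return counts

# The methodology's own example: two Florida co-authors, one Texas A&M
# co-author, and one Georgia co-author on a single article.
article = ["Florida", "Florida", "Texas A&M", "Georgia"]
print(sorted(count_publications([article]).items()))
# → [('Florida', 1), ('Georgia', 1), ('Texas A&M', 1)]
```

Note how the `set()` call is what implements the same-institution rule: the two Florida co-authors collapse into a single credit for Florida.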

The full TAMUGA Ranking includes publication totals for U.S. schools for each year from 2019 to 2023, as well as a five-year aggregate.

Wharton also tops the five-year aggregate with 85 total publications, but falls to 23rd on publications per faculty at a rate of 1.89 per faculty member.

University of Georgia’s Terry College of Business ranks second in the number of management articles published over the five-year period, at 70, as well as second in articles per faculty at 5.09. That may well explain why the school helps to compile this list with Texas A&M, which also, surprisingly, boasts a favorable ranking here. Terry is followed by the University of Florida and Texas A&M University with 68 articles each, and Arizona State University with 66.

WHARTON ALSO TOPS UT-DALLAS RANKING

Wharton also tops the UT-Dallas Top 100 Business School Research Rankings for 2024 with 392 articles published. Both that ranking and the Financial Times ’ research ranking consider more academic journals across more disciplines, but the TAMUGA Ranking counts only management faculty and just for one calendar year.

The FT ranking weighs the number of publications by a school’s current faculty in 50 academic and practitioner journals over a 30-month span, while UT-Dallas’ instead counts a school’s publications in 24 journals over a five-year span. Apples-to-apples comparisons, then, are not easy. However, Wharton, with its more than 20 research centers and initiatives, manages to top all three.

It has topped the TAMUGA Ranking for the last three years, rising from a three-way tie for seventh in 2019 with 12 published management articles (0.27 per scholar) to a three-way tie for third in 2020 (13 articles, or 0.28 per scholar). In 2021, it tied for first with Texas A&M with 19 articles. However, in all those years, it has only broken the top 20 in the articles-per-faculty metric.

You can see the top 10 finishers for both 2023 and 2022 in the chart below. Or click through the following pages to see the top 50 universities in management research over the last five years, in 2023, and in 2022.

You can see the full TAMUGA ranking here.



Google Confirms Links Are Not That Important

Google's Gary Illyes says that Google needs very few links, more evidence that links matter less than at any other time in SEO history


Google’s Gary Illyes confirmed at a recent search marketing conference that Google needs very few links, adding to the growing body of evidence that publishers need to focus on other factors. Gary tweeted confirmation that he indeed did say those words.

Background Of Links For Ranking

Links were identified in the late 1990s as a good signal for search engines to use in validating how authoritative a website is, and Google discovered soon after that anchor text could be used to provide semantic signals about what a webpage was about.

One of the most important research papers was Authoritative Sources in a Hyperlinked Environment by Jon M. Kleinberg, published around 1998 (link to the research paper at the end of the article). The paper’s main insight is that there were too many web pages and no objective way to filter search results for quality in order to rank web pages for a subjective idea of relevance.

The author of the research paper discovered that links could be used as an objective filter for authoritativeness.

Kleinberg wrote:

“To provide effective search methods under these conditions, one needs a way to filter, from among a huge collection of relevant pages, a small set of the most ‘authoritative’ or ‘definitive’ ones.”

This is the most influential research paper on links because it kick-started further research on using links not just as an authority metric but also as a subjective metric for relevance.
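
Kleinberg’s paper introduced the HITS algorithm, which scores pages as “hubs” (pages that link to good sources) and “authorities” (pages that many good hubs point to). A toy sketch of the iteration, on a made-up three-page graph, looks like this:

```python
def hits(links, iterations=50):
    """Minimal HITS iteration (Kleinberg): a page's authority score is
    the sum of the hub scores of pages linking to it; its hub score is
    the sum of the authority scores of pages it links to. Scores are
    normalized each round so they converge."""
    pages = set(links) | {q for outs in links.values() for q in outs}
    hub = {p: 1.0 for p in pages}
    for _ in range(iterations):
        auth = {p: sum(hub[q] for q, outs in links.items() if p in outs) for p in pages}
        norm = sum(v * v for v in auth.values()) ** 0.5
        auth = {p: v / norm for p, v in auth.items()}
        hub = {p: sum(auth[q] for q in links.get(p, [])) for p in pages}
        norm = sum(v * v for v in hub.values()) ** 0.5
        hub = {p: v / norm for p, v in hub.items()}
    return hub, auth

# Invented graph: two hub pages both point at "authority", so it ends
# up with the top authority score.
graph = {"h1": ["authority"], "h2": ["authority", "other"], "other": []}
hub, auth = hits(graph)
```

This is a sketch under the assumption of a small, well-connected graph; a production implementation would handle empty graphs and check for convergence rather than running a fixed number of rounds.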

Objective means something factual; subjective means something closer to an opinion. The founders of Google discovered how to use the subjective opinions of the Internet as a relevance metric for deciding what to rank in the search results.

What Larry Page and Sergey Brin discovered and shared in their research paper (The Anatomy of a Large-Scale Hypertextual Web Search Engine – link at the end of this article) was that it was possible to harness the power of anchor text to determine the subjective opinion of relevance from actual humans. It essentially crowdsourced the opinions of millions of websites, expressed through the link structure between webpages.
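
Brin and Page’s idea of letting link structure vote on relevance is usually illustrated with PageRank. A minimal power-iteration sketch over an invented three-page “web” (the graph and page names are illustrative):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Minimal PageRank power iteration over an adjacency dict
    {page: [pages it links to]}. Each page 'votes' for the pages it
    links to; the damping factor models a random jump to any page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
# "c" collects links from both "a" and "b", so it ends up ranked highest
```

The damping factor of 0.85 matches the value used in the original paper; real systems add many refinements on top of this basic iteration.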

What Did Gary Illyes Say About Links In 2024?

At a recent search conference in Bulgaria, Google’s Gary Illyes made a comment about how Google doesn’t really need that many links and how Google has made links less important.

Patrick Stox tweeted about what he heard at the search conference:

“We need very few links to rank pages… Over the years we’ve made links less important.” — @methode #serpconf2024

Google’s Gary Illyes tweeted a confirmation of that statement:

“I shouldn’t have said that… I definitely shouldn’t have said that”

Why Links Matter Less

The initial state of anchor text when Google first used links for ranking purposes was absolutely non-spammy, which is why it was so useful. Hyperlinks were primarily used as a way to send traffic from one website to another website.

But by 2004 or 2005, Google was using statistical analysis to detect manipulated links. Around the same time, “powered-by” links in website footers stopped passing anchor-text value; by 2006, links near the word “advertising” and links from directories stopped passing ranking value; and in 2012 Google deployed a massive link algorithm called Penguin that destroyed the rankings of likely millions of websites, many of which were using guest posting.

The link signal eventually became so bad that Google decided in 2019 to selectively use nofollow links for ranking purposes. Google’s Gary Illyes confirmed that the change to nofollow was made because of the link signal.

Google Explicitly Confirms That Links Matter Less

In 2023, Google’s Gary Illyes shared at PubCon Austin that links were not even in the top three ranking factors. Then in March 2024, coinciding with the March 2024 Core Algorithm Update, Google updated their spam policies documentation to downplay the importance of links for ranking purposes.


The documentation previously said:

“Google uses links as an important factor in determining the relevancy of web pages.”

The documentation was updated to remove the word “important.” Links are now listed as just another factor:

“Google uses links as a factor in determining the relevancy of web pages.”

At the beginning of April, Google’s John Mueller advised that there are more useful SEO activities to engage in than link building.

Mueller explained:

“There are more important things for websites nowadays, and over-focusing on links will often result in you wasting your time doing things that don’t make your website better overall”

Finally, Gary Illyes explicitly said that Google needs very few links to rank webpages and confirmed it.


Why Google Doesn’t Need Links

The reason why Google doesn’t need many links is likely the extent of AI and natural-language understanding that Google uses in its algorithms. Google must be highly confident in its algorithm to be able to explicitly say that it doesn’t need them.

Way back when Google implemented nofollow in the algorithm, many link builders who sold comment-spam links continued to claim that comment spam still worked. As someone who started link building at the very beginning of modern SEO (I was the moderator of the link-building forum at the #1 SEO forum of that time), I can say with confidence that links stopped playing much of a role in rankings several years ago, which is why I stopped building them about five or six years ago.

Read the research papers

Authoritative Sources in a Hyperlinked Environment – Jon M. Kleinberg (PDF)

The Anatomy of a Large-Scale Hypertextual Web Search Engine

Featured Image by Shutterstock/RYO Alexandre


