Ask us anything: Some leading ecology journals rank surprisingly low in impact factor. What does that say about ecology as a scientific field?

Recently, we invited you to ask us anything! Today’s question comes from Falko Buschke (click that link for the full question, which I’m about to summarize).

Falko notes that, in the latest Clarivate journal citation reports, several leading ecology journals rank surprisingly low in impact factor (IF) among all 195 journals in the Clarivate “ecology” category. For instance, Journal of Ecology is only 20th in IF. Ecology is 32nd. Journal of Animal Ecology is 48th. Oikos is 56th. American Naturalist is 82nd. The question is, what, if anything, do the comparatively low IF ranks of these leading journals tell us about ecology as a field?

Falko took a few stabs at answering his own question. Maybe the low IF rankings mean these journals aren’t chasing trends:

Personally, I think the silver lining is that these journals are probably not jumping on publishing bandwagons, which is a sign that their editorial standards have been consistent. I reckon this is a good thing in the long-run.

On the other hand, maybe “not chasing trends” is just another way of saying “catering to a niche audience”:

But an alternative explanation could be that the type of research in these journals is inaccessible or uninteresting to most researchers and lacks broad appeal (i.e. these journals have become echo chambers for like-minded thinkers).

On the third hand, maybe “not chasing trends” is just another way of saying “continuing to cater to American, British, and European authors and readers, while the field of ecology globalizes”:

A third possibility is that it is a sign of globalisation in ecology, where the centre of focus has moved away from the traditional strongholds in USA/UK/Europe.

Falko’s answers to his own question are so interesting, I was tempted to just let them stand and call it a day. 🙂 But that would be against the spirit of an AUA, so here are our answers:

Jeremy’s answer:

First, two minor caveats.

Caveat #1: Impact factors are arithmetic means of highly right-skewed data. Most papers in any given journal draw few citations, while a few papers draw many. So a journal’s two-year IF, and its IF rank, can bounce around a fair bit from year to year, as an occasional high-impact paper enters or exits the sample.* That’s why, if you’d compiled the same data a few years ago, you’d have found Ecology Letters ranked #1, if memory serves.

Caveat #2: Three of the highly ranked journals in the Clarivate “ecology” category aren’t really ecology journals (Landscape and Urban Planning at #7, Ecological Economics at #10, International Journal of Sustainable Development at #11). Ok, the boundaries of “ecology” are fuzzy. And yes, there certainly are connections between ecology, urban planning, sustainable development, and economics. Closer connections than between ecology and, say, materials science or quantum mechanics. But you have to draw some boundaries if a scholarly field is to have enough coherence to be worthy of having a name. I’m fine with drawing a line between ecology and urban planning or sustainable development or economics.
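Here is a minimal sketch of Caveat #1, assuming a made-up lognormal citation distribution (the distribution and its parameters are illustrative, not fitted to any real journal). Because the IF is just the arithmetic mean of a right-skewed pile of per-paper citation counts, it wobbles from “year” to “year” even though nothing about the simulated journal changes:

```python
# Sketch: IF as the arithmetic mean of right-skewed citation counts.
# The lognormal distribution and its parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_papers = 150   # citable items in the two-year window (made-up number)

simulated_ifs = []
for year in range(10):
    # Most papers get a handful of citations; a few get a lot.
    citations = np.round(rng.lognormal(mean=1.0, sigma=1.2, size=n_papers))
    simulated_ifs.append(citations.mean())  # this mean is the "IF"

print("median citations per paper (last simulated year):", np.median(citations))
print("simulated IFs:", [round(x, 2) for x in simulated_ifs])
print("year-to-year SD of the simulated IF:", round(np.std(simulated_ifs), 2))
```

The median paper sits well below the mean, and the mean bounces around purely because of which tail papers happen to land in the window.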

But those are minor caveats that don’t really undermine the premise of your question, Falko. Your premise is correct. It is in fact the case that the journals that I, and many of my friends in ecology, think of as among the “leading” journals in the field are never at the very top of the IF rankings in ecology, or even particularly close. I think you’re right that that tells us something about the field of ecology. But what?

To fully answer that question, I’d think you’d want to go back through many years’ worth of the annual IF rankings. And also do a citation network analysis. Which would be work, so I’m not going to do it. (But Brian did! See below.) Anyway, just looking at the current IF ranking list for ecology, several answers jump out at me:

Ecologists love to cite review papers and methods papers. The #1, #3, #13, #16, and #18 ecology journals by two-year IF are review journals or methods journals.

One reason some leading US/British/European ecology journals aren’t all that high on the IF ranking list is…other leading US/British/European ecology journals. Nature Ecology and Evolution is #2 on the list, Ecological Monographs is #9, and Functional Ecology is #28.

Ecology as a field definitely has become more global and more applied since the early ’90s. One symptom of those shifts was the founding of new ecology journals focused on global scale data (especially biogeography), global climate change, and applied issues. Many (though far from all!) papers in those topic areas, and citations to papers in those topic areas, now go to journals that specialize in those topics, rather than to general ecology journals like Ecology. Global Change Biology is #5, Global Ecology and Biogeography is #12, Ecosystem Services is #13, Agriculture, Ecosystems, and Environment is #15, Ecography is #19, Conservation Biology is #21, Journal of Applied Ecology is #24, Biological Conservation is #25, Diversity and Distributions is #27, Ecological Applications is #34. Notice in particular that some of those global ecology and applied ecology journals are from the same scientific societies that publish leading general ecology journals like Ecology, JEcol, JAE, and Oikos. Those scientific societies did jump on a publishing “bandwagon” (really, a long-term shift in the direction of the entire field of ecology). But they did it by founding new journals, rather than by expanding the scope of their existing journals.

Another big change in ecology since the early ’90s is the rise of cheap gene sequencing and other genetic and molecular techniques. That made it a lot more feasible to study microbes. Hence the rise of microbial ecology, and of journals devoted to microbial ecology. (ISME Journal at #4, ISME Communications at #22). And the associated rise of journals devoted to molecular techniques, and to research based on molecular techniques (Molecular Ecology Resources, #18; Molecular Ecology, #30).

Chinese science is on the rise. Acta Ecologica Sinica (recently renamed Ecological Frontiers) is #26 on the list.

I definitely would not characterize any journals published by the ESA, BES, or Nordic Society Oikos as “echo chambers for like-minded thinkers”! Any more than I would characterize (say) Global Change Biology as an echo chamber for people who all think about global change.

I do think there are some journal-specific factors at work in some cases, but I’m not going to comment on them.

*Although IF doesn’t bounce around as much as I thought when I typed that sentence; see Brian’s answer below.

Brian’s answer:

Like Jeremy said, Impact Factor (IF) is driven mostly by the 3-10 most cited papers in a journal. Even a journal like Science has a median citation rate that is pretty pedestrian. It’s those papers with 100s of citations in two years (for Science, or 30-100 for more pedestrian journals) that drive the whole IF. Also, I’ve never seen a perfect study of this, but as best I can tell the standard error bars around IF should be close to +/- 1.0. That is pretty big. And when you look at rankings, changes of even 0.1 or 0.2 in IF can make a big difference in rank, so rankings are even noisier.
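Here is a minimal sketch of that ranking point, using 195 invented journal IFs rather than Clarivate’s actual numbers: in the crowded middle of the distribution, an IF bump of just 0.2 can move a journal several places.

```python
# Sketch: how a small IF change moves a journal's rank.
# The 195 journal IFs below are simulated, not real Clarivate values.
import numpy as np

rng = np.random.default_rng(7)
field_ifs = np.sort(rng.lognormal(mean=1.0, sigma=0.6, size=195))[::-1]

def rank_of(journal_if, others):
    """Rank = 1 + number of journals with a strictly higher IF."""
    return int(1 + np.sum(others > journal_if))

focal = field_ifs[60]                # a mid-pack journal
others = np.delete(field_ifs, 60)    # everyone else in the field
print("IF", round(focal, 2), "-> rank", rank_of(focal, others))
print("IF", round(focal + 0.2, 2), "-> rank", rank_of(focal + 0.2, others))
```

With IFs packed that tightly, a shift of several places in rank can reflect essentially no change in a journal’s citation performance.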

Second, IF is only comparable WITHIN a field. As Jeremy said, what counts as a meaningful comparison for a field is subjective. Just as an example, the top marine sciences journal (MEPS, or Marine Ecology Progress Series) has an IF of 2.2 and a rank of 93. Is that partly because marine scientists put papers with more general implications in general ecology journals? Yes. But comparable terrestrial journals that don’t get a lot of really general ecology papers have much higher IFs. That, of course, is not a signal of the quality or importance of a field, but rather of the number of scientists publishing in the field: there are more terrestrial ecologists than marine ecologists. It is also a function of how fast the field turns around; the ISI IF counts only two years of impact, so if you get cited a ton 3 years after you publish, it doesn’t show up at all. So (plausibly) if a really cool paper in MEPS inspires a lot of new experiments, but they all take several years to get launched due to limits on ship time, etc., marine science will have a lower IF just due to turnaround time. Freshwater ecology has an even lower citation rate than marine, and paleoecology is lower still. Biomedical fields are much higher, both because of the number of scientists and the speed of turnaround. So it’s really vital to only compare within a field.
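For concreteness, the two-year IF is just the citations a journal receives this year to what it published in the previous two years, divided by the citable items it published in those two years. A toy calculation (all numbers invented) shows how a slow-citing field ends up with a low IF even if its papers do well eventually:

```python
# Toy two-year IF calculation; all numbers are invented for illustration.
def two_year_if(citations_this_year_to_last_two_years, citable_items_last_two_years):
    return citations_this_year_to_last_two_years / citable_items_last_two_years

# Fast-citing field: most citations arrive within the two-year window.
print(two_year_if(900, 150))   # IF = 6.0

# Slow-citing field: same eventual impact, but most citations arrive in
# years 3-5, outside the window, so the IF looks low.
print(two_year_if(330, 150))   # IF = 2.2
```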

When I was Editor-in-Chief at Global Ecology and Biogeography I used to watch IF pretty closely. The June release was a bit like Oscars night for journal editors. As a journal editor I was torn by this. I regularly told the AEs to ignore IF and focus on the quality of the paper and the quality of the process. At the same time, I knew I would be having a pointed conversation with the managing editor, who would definitely be discussing IF. To give Wiley credit, I don’t think they over-managed for IF, but it was always discussed. But, worse, both they and I paid attention to IF because we knew authors were looking closely at IF and choosing where to submit based on it. So I felt an obligation to at least maintain the status quo. As a result, despite telling AEs to ignore IF, I was always thinking about ways to game the system. Mostly that consists of trying to get more review and methods papers (which I did), holding likely hot papers until January of a new year so they have maximum time to accumulate citations in the two-year window, and pushing authors to cite papers published in the same journal in the last two years (the latter two of which I did not do, but other EICs have). But if you look at GEB, the trend in IF is pretty close to flat (bouncing around between 6.5 and 7.0). I think chasing IF per se is a pipe dream.

With those qualifications, what can we say specifically about Falko’s question about trends? Here is some hard data. I pulled down the ISI IF for the ecology field for 2008, 2011, 2014, 2017, 2020, and 2023 (the most recent year, released in summer 2024). I took the top 50 journals in 2023 and plotted trendlines for the ISI IF and the ISI rank (within ecology journals). The vertical scale for IF is 0-20 and the vertical scale for rank runs 1-50, oriented so that an upward trend is good. I’ve only shown sparkline trends as I don’t want to start treading into the waters of republishing ISI data, but I think it is enough for the story. Each red dot is one of those years (the dots are three years apart).
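For anyone who wants to try something similar, here is roughly how such a sparkline panel could be built from a journal-by-year table of IFs and ranks. This is a sketch only: the file name and column layout are hypothetical, and it is not necessarily how the figure described above was actually made.

```python
# Sketch of an IF/rank sparkline figure: one tiny IF panel and one rank
# panel per journal. Assumes a hypothetical CSV with columns:
# journal, year, IF, rank.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("ecology_if_by_year.csv")   # hypothetical file
journals = df["journal"].unique()[:50]       # top 50 journals

fig, axes = plt.subplots(len(journals), 2, figsize=(4, 0.4 * len(journals)))

for i, name in enumerate(journals):
    sub = df[df["journal"] == name].sort_values("year")
    # Left panel: IF on a common 0-20 scale.
    axes[i, 0].plot(sub["year"], sub["IF"], color="steelblue")
    axes[i, 0].plot(sub["year"], sub["IF"], "o", color="red", markersize=2)
    axes[i, 0].set_ylim(0, 20)
    # Right panel: rank, with the axis flipped so an improving rank trends upward.
    axes[i, 1].plot(sub["year"], sub["rank"], color="steelblue")
    axes[i, 1].plot(sub["year"], sub["rank"], "o", color="red", markersize=2)
    axes[i, 1].set_ylim(50, 1)
    # Strip the axes down to bare sparklines.
    for ax in (axes[i, 0], axes[i, 1]):
        ax.set_xticks([])
        ax.set_yticks([])

plt.tight_layout()
plt.savefig("if_sparklines.png", dpi=200)
```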

The society journals of the type Falko mentioned are highlighted in grey. Two journals of that type didn’t make the top 50: AmNat (which went from IF=4.67, rank=14 in 2008 to IF=2.4, rank=82 in 2023) and Oikos (which went from IF=2.97, rank=33 in 2008 to IF=3.1, rank=61 in 2023).

So what does this tell us?

Most journals are pretty steady in IF. In fact, clear long-term trends are the exception rather than the rule. On the other hand, rankings can magnify very small changes in IF. I think “constant IF, constant or dropping rank” mostly describes the society journals Falko mentioned, which are highlighted in light grey in the figure. Really, it’s pretty astonishing how journals that have been around for more than 10 years or so have an almost constant IF.

If you look at who is at the top of the list, it is always going to be review journals and methods journals (TREE, AREES, FEE, ECOL LETT, METHODS). After that, a lot of the other top hitters are defined by subject area (remember, comparing IF between subject areas is a fool’s game). So microbial ecology (ISME) and global change ecology (GCB) rank high. Journals that tap into multiple fields like geography, environmental science, climate, etc. all tend to get more citations too because of the size of the pool (e.g. GEB, Ecosystem Services, ISME, Ecography). All of these have always ranked higher than journals like Ecology or J. Animal Ecology or Oikos. This is just the nature of IF and not a sign of anything bad about those journals. In honesty, if you take out the new journals, the 2023 list looks an awful lot like the 2008 list, which brings me to …

The other big change is the new journals (most of the ones that don’t have six dots and a blue line all the way across). There were 124 journals in the Ecology category in 2008 and 195 in 2023; that is more than a 50% increase in 15 years! Nature Ecology and Evolution is the most obvious addition. I spoke about brand extensions elsewhere. I personally believe it is primarily responsible for the decline in Ecology Letters, although I cannot prove it. The next new ones are not really new: Landscape and Urban Planning and Ecological Economics are both decades old and were classified in Ecology in 2008, but were not in the top 50 that year. So it’s a shortcoming of my methodology that they appear new. What really happened is that they are the rare journals with a strong upward trend in IF, which pulled them into my top 50 analysis. Most of the others that appear new since 2008 really are new, and they managed to pull off the feat of launching at a high IF. They did that through a variety of strategies (again, brand extension for NEE, a highly cited field for Ecosystem Services, a review/opinion-oriented format for Current Opinion in Insect Science, a methods-oriented format for Ecological Informatics and Molecular Ecology Resources). You can find a geography story in Acta Ecologica Sinica, but that’s actually pretty rare.

It is also hard not to notice that applied ecology journals (ecosystem services, global change, the ecology of agriculture, human dimensions like economics and geography) are moving up, both in terms of the raw number of journals and in that they explain most of the cases of a strong upward trend in citations/IF.

TL;DR: So, to summarize and directly answer Falko’s question:

I think in general society journals are doing just fine. Their IF is fairly flat (as is true of most other journals; flat does not mean they are being left behind). Some, like Oikos and Ecology, seem to have dropped a lot in rank, but that is mostly a function of rank being hypersensitive and of newly created journals popping in ahead of them. I wouldn’t worry about shifts in rank.

I don’t think there is much evidence of a geographic shift in journals (although there is a profound geographic shift in authors). Authors are too busy chasing IF in their choice of journals to cause a geographic shift in journals. As long as authors chase IF, the establishment journals will remain the establishment.

I do think editors should focus on quality of process and quality of outcome and not chase IF. It barely works to chase IF anyway (unless you start to reach into unethical territory). I think most editors and journals are actually focusing on quality and process and probably making a few efforts here and there to increase IF, but not really having much effect.

Readers’ opinions about the growth of applied journals will vary. But it’s hard to imagine it will go away.

If I were to advise a society about its journals, I would not worry if IF was flat for existing journals (even if this resulted in a drop in rank). Instead, I would be looking to launch new journals in applied areas (or in a methods/opinion/review format, or trying to guess the next big trend, like microbial ecology), rather than worrying about the existing journals. Which is mostly what is happening (e.g. the BES launch of People and Nature, or the ESA launch of Earth Stewardship and, 20 years ago, of Frontiers in Ecology and the Environment).
