ANALYSIS | 2019-09-17

Harnessing Peer Review Data for Journal-Level Statistics

Publons
The peer-review registration platform Publons could generate freely available, machine-readable journal-level information on the number of submissions, acceptance and rejection rates, turnaround times, the number of revision rounds, and the demographic background of authors, editors and referees: a modest vision for Peer Review Week 2019.

One can occasionally find journal-level statistics that may affect where researchers submit their manuscripts. For instance, the Journal of Politics recently tweeted that it had received 1251 submissions in 2018. Other journals publish annual reports: International Studies Quarterly reported an acceptance rate of 3.1% in 2016 (according to its latest available report), and at the American Journal of Political Science, the average turnaround time (i.e. the period between first receipt of a submission and the first decision) for its 1035 submissions in 2018 amounted to 55 days (cf. the AJPS 2019 Report). Such numbers tempt one to look for comparable data from other journals – only to find that journal-level statistics are generally unavailable. It is only thanks to the voluntary service of editors or journal owners that we sometimes get a glimpse of journal-level data; in most cases, one will not find any reliable information on submissions, acceptance rates, turnaround times or demographic data about anonymous reviewers.

There may be ways out, though some possible solutions are less efficient than others. Platforms that rely on crowdsourced data from authors who voluntarily report their journal-submission experiences, such as SciRev, do not seem to gain traction: one can find merely 6 reports on the Journal of Politics at SciRev, even though the journal received 1251 submissions last year. Another possibility is to solicit such data from the journal editors themselves, as the blog Review my Review did some years ago. This is a noble approach, but it is likely to suffer from a lack of resources and validity – indeed, Review my Review, which contained journal-level data for dozens of political science journals, has been offline for a while now, and there is no sign that it will be resuscitated.

But there is hope: one existing platform could indirectly offer a wealth of data for any scientific journal out there – namely Publons (owned by Clarivate Analytics / Web of Science).

Publons' core activity is to turn peer reviews into measurable units. It not only helps scholars get recognition for their anonymous reviews, but also enables journal editors to recruit from a huge reviewer pool. In other words, Publons aims “to speed up science by harnessing the power of peer review”.

While for most journals Publons still relies on crowdsourced data from referees who voluntarily submit their reviews, a growing number of publishers and journals have established “official partnerships” with Publons, meaning that these journals automatically register peer reviews on the platform. When I last counted (more than a year ago), roughly 50 out of ca. 225 SSCI-indexed political science journals had established such official partnerships, among them outlets as prominent as the American Political Science Review, the Journal of Peace Research, or JCMS: Journal of Common Market Studies. Together, these official partners have passed on thousands of reviews to Publons; as of September 2019, for example, Publons has documented 1728 reviews for the American Political Science Review.

As of now, Publons merely reports the number of registered peer reviews per journal and some other statistics (such as the average word count of a referee report in a given discipline). However, Publons could combine its data in such a way that, for every single journal, one could automatically infer the number of submissions, acceptance and rejection rates, turnaround times, the number of revision rounds, demographic information about authors, reviewers and editors, and many other valuable journal metadata. Of course, such data should be anonymized and aggregated so as to ensure data protection and not to compromise the blindness of the peer-review procedure.
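To make this concrete, here is a minimal sketch of what such an anonymized, aggregated journal-level record could look like. All field names and figures are purely illustrative assumptions, not an existing Publons format:

```python
# Hypothetical example of an anonymized, aggregated journal-level record.
# All field names and values are illustrative assumptions, not an existing
# Publons data format.
import json

journal_record = {
    "journal": "Example Journal of Political Science",  # fictitious journal
    "year": 2018,
    "submissions": 1035,            # illustrative figure
    "acceptance_rate": 0.09,        # accepted / decided submissions
    "median_turnaround_days": 55,   # first receipt to first decision
    "median_revision_rounds": 2,
    "reviewer_pool": {
        "share_female": 0.34,       # aggregated shares only,
        "share_early_career": 0.28, # never individual-level data
    },
}

# Machine-readable output: serialize the record as JSON for automated reuse.
print(json.dumps(journal_record, indent=2))
```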

To give some potential examples: through a combination with personal information stored in Web of Science's ResearcherID, Publons could open up statistics on the gender composition of a journal's or a discipline's reviewer pool. By linking its records to other platforms, such as ORCiD, it could also calculate the share of early-career researchers involved in peer-reviewing. And since Publons receives information on the papers reviewed, it could open up data on the median number of revision rounds, or on frequently occurring resubmission trajectories after rejection. All these data could reveal a lot about scientific practices.
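As a sketch of how such aggregation could preserve anonymity, consider the following toy example; the records, field names and linkage are entirely made up:

```python
from statistics import median

# Entirely fictitious per-review records; in reality, these would come from
# Publons' internal data linked to ResearcherID or ORCiD profiles.
reviews = [
    {"journal": "Journal A", "early_career_reviewer": True,  "revision_round": 1},
    {"journal": "Journal A", "early_career_reviewer": False, "revision_round": 2},
    {"journal": "Journal A", "early_career_reviewer": True,  "revision_round": 1},
    {"journal": "Journal A", "early_career_reviewer": False, "revision_round": 3},
]

# Aggregate to the journal level: only shares and medians would be published,
# never the individual records themselves.
n = len(reviews)
share_early_career = sum(r["early_career_reviewer"] for r in reviews) / n
median_rounds = median(r["revision_round"] for r in reviews)

print(f"Share of early-career reviewers: {share_early_career:.0%}")  # 50%
print(f"Median revision round reviewed: {median_rounds}")            # 1.5
```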

In order to enable the efficient use of such data (for instance, for research purposes), this journal information should be provided as Open Data, i.e. as freely available, machine-readable metadata. Publons does provide an API, but it currently conveys only limited information: the partnership status of a journal or the number of registered peer reviews per journal cannot (yet) be fetched from it, so these data have to be collected manually, which is far too laborious. The lack of such open data from Publons is a major weakness, but one that could easily be remedied.
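If Publons extended its API along these lines, fetching journal-level statistics could look roughly like the sketch below; the endpoint and field names are hypothetical, since no such endpoint exists at the time of writing:

```python
import requests

# Hypothetical endpoint and field names: Publons' actual API does not (yet)
# expose journal-level statistics like these. Everything below is an assumption.
BASE_URL = "https://publons.com/api/v2/journal-stats/"

def fetch_journal_stats(journal_id: str, year: int) -> dict:
    """Fetch aggregated, anonymized journal-level statistics for one year."""
    response = requests.get(BASE_URL, params={"journal": journal_id, "year": year})
    response.raise_for_status()
    return response.json()

# Usage (would only work if such an endpoint existed):
# stats = fetch_journal_stats("apsr", 2018)
# print(stats["submissions"], stats["acceptance_rate"])
```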

This is, of course, just a personal interpretation from someone not involved with Publons; it is possible that I am wrong about the potential.

To sum up, OOIR's wish (or vision?) for 2019's Peer Review Week is that Publons combines and opens up its data to generate freely available, machine-readable journal-level information on the number of submissions, acceptance and rejection rates, turnaround times, the number of revision rounds, and the demographic background of authors, editors and referees.


