F1000 and ORCID Partner to Launch Standard for Citing Peer Review Activities

By Laure Haak

Peer review is an essential component of the research and scholarly lifecycle. And yet, researchers' peer review activities are rarely acknowledged as a fundamental contribution to research. At the same time, the need to encourage more researchers to participate in the review process, to deal with an ever-increasing number of manuscripts and grants, is bumping into a growing desire to build more trust into peer review processes. The time is right to think creatively about how to build incentives (including public acknowledgement) and trust (through validation).

The practice of acknowledging peer review

With this in mind, ORCID and F1000 partnered to develop a practical means to cite peer review activities, based on a standard set of terms and the use of persistent identifiers. “The formal but completely transparent peer review process post-publication used on both the F1000Prime article recommendation service and on articles published in F1000Research provides us with the opportunity to maximize the credit we can give back to our referees for the often significant time and effort required to produce a good peer review - work that typically remains invisible”, says Rebecca Lawrence, Managing Director of F1000Research.  “Enabling referees to both be able to cite their often lengthy referee reports, as well as include them on lists of their outputs such as ORCID records, is a major step forward in properly recognizing this activity that is essential for the advancement of scientific understanding.”

Working with the Consortia Advancing Standards in Research Administration Information (CASRAI), we established a community working group to define a standard field set and business rules that would work across the many types of peer review used in publishing, funding, university research management, and conference presentations.  Across formal review types, we found that the main challenge was how to define a standard that is flexible enough to enable the full range of review, from anonymous double-blind processes to fully open review. As a group, we concluded that, for anonymous reviews, the entity requesting and managing reviews should provide an aggregate count of reviews for that reviewer over an appropriate time period.

Working Group recommendations

We are pleased to announce the publication of the Working Group recommendations for a peer review activity data profile.  The Working Group concluded that the following citation data elements should be used to describe peer review:

  • PERSON: Fields, including a person identifier, describing the person who performed the review and is being recognized for this review activity.
  • REVIEW: Fields, including a review identifier, describing the review itself. In the case of blind or otherwise unshared reviews, this information may be left blank.
  • SUBJECT: Fields describing the subject of the review, such as the paper, grant or other item. In the case of blind or otherwise unshared reviews, this information also may be left blank.
  • ORGANIZATION: Fields, including an organization identifier, describing the organization that is recognizing the person for review activity, such as a publisher, association, or funder.

To more clearly articulate the contributions, and to better fit with typical peer review workflows, the Working Group recommended that each review contribution be described independently. For example, if a person performed four reviews for a particular journal, four review descriptions would be needed to describe the activity. However, to preserve anonymity where peer review requires it, aggregation of activities with minimal metadata is acceptable: only the person, the organization or resource that sponsored the review, the count of reviews over a period of time, and a unique source identifier (such as a front-matter page listing reviewers) are required to recognize the contribution.
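To make the data profile concrete, here is a minimal sketch in Python of the two record shapes the Working Group describes: one citation per review for open review, and an aggregate record for anonymous review. The field names (`person`, `review`, `subject`, `organization`, `review-count`, and so on) and the example identifiers are illustrative assumptions, not the official CASRAI schema.

```python
# Hypothetical sketch of the peer review citation data profile.
# Field names and identifiers are illustrative, not the official schema.

def review_citation(person_id, org_id, org_name,
                    review_id=None, subject_id=None, subject_title=None):
    """Build a single peer-review citation record.

    For blind or otherwise unshared reviews, the REVIEW and SUBJECT
    fields may be left blank (None here).
    """
    return {
        "person": {"orcid": person_id},
        "review": {"identifier": review_id},
        "subject": {"identifier": subject_id, "title": subject_title},
        "organization": {"identifier": org_id, "name": org_name},
    }

def aggregate_citation(person_id, org_id, org_name, count, period, source_id):
    """Anonymous alternative: an aggregate count of reviews over a period,
    plus a source identifier (e.g. a front-matter page listing reviewers)."""
    return {
        "person": {"orcid": person_id},
        "organization": {"identifier": org_id, "name": org_name},
        "review-count": count,
        "period": period,
        "source": source_id,
    }

# One record per review: four reviews for a journal means four records.
open_reviews = [
    review_citation("0000-0002-1825-0097", "org.example", "Example Journal",
                    review_id=f"10.5555/review.{n}",
                    subject_id=f"10.5555/paper.{n}")
    for n in range(4)
]
```

Note how the aggregate form carries no review or subject identifiers at all, which is what allows the reviewer to be credited without the individual reviews being disclosed.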

Implementing recommendations

Several organizations plan to adopt this peer review citation standard when collecting, storing, and exchanging peer review information. In April 2015, the ORCID technical team began interpreting and implementing the working group recommendations, and we will be launching support for peer review in our APIs and user interface this summer. Here is what some other organizations are planning:

  • F1000 are planning to implement the new review citation data model to post validated review activity to reviewer ORCID records via their platform, and work with their partners to facilitate integration into conference review workflows.
  • American Geophysical Union, working with eJournalPress, are planning to implement a workflow to collect ORCID iDs for their reviewers and post validated review activity to reviewer ORCID records using the new data standard.
  • The APSA’s Politics and Religion journal is planning to collect ORCID iDs for its reviewers.
  • Peerage of Science plans to integrate ORCID and the review citation standard into its new Reviewer Profiles feature.
  • Europe PubMed Central plans to display reviewer information and links to open reviews.
  • Kudos are planning to work with ORCID to test workflows and APIs for displaying peer review citations in a third party impact-management platform.

We call on all organisations that use peer review (publishers, funders, associations, universities) to enable their referees to get proper credit for the work they do. Help us to spread the word!


Co-authored by Rebecca Lawrence, Managing Director, F1000Research.