If you are committed to open science, your options for reviewing papers are fairly limited. Essentially, your only option is to review for journals which support and enforce open science principles. You are then trusting the journal to make sure the authors make their data and code available, and publish the article open access. Moreover, you are forced to decline to review interesting papers because they were not submitted to one of the few journals supporting open science, or because the authors could not afford the article processing charge and submitted to a closed-access journal. This approach provides no way to tell the authors of these papers that you would have been happy to review their paper had they committed to making their article, code and data openly accessible to the community. It provides no incentive for authors to be more open.
We wanted to create an alternative, and this is where the idea of open-science agreements came from. When you are invited to review a paper, you can visit http://academickarma.org/reviewagreement to specify (anonymously) what you expect from the authors in terms of openness before you agree to review their paper. You also specify how open you plan to make your review, and how long you expect the review to take. Rather than asking the authors to pay to publish open access, the reviewer can ask that both the submitted and revised manuscripts are uploaded to a preprint server. The authors can suggest modifications and explain if they are unable to satisfy all of the reviewer’s expectations. If an agreement is reached, the reviewer can review the paper with the knowledge that their free labour is supporting resources (code, data, manuscript) which can be re-used by everyone.
More information on how open science agreements work is provided here.
In preparation for open access week, we are asking for feedback on our new initiative: Reviewer-Author contracts.
Academics review manuscripts for free while publishing companies make billions by charging readers to access the work. We think that a viable alternative is for scientists to only agree to review manuscripts which are first deposited in preprint servers, and to make the content of their review openly available alongside the preprint. Reviewer agreements are a way to give the reviewer a bit more influence over the openness (open access, open code, open data) of the papers they choose to review, or at least a way for them to choose to only review papers which follow open practices.
How does it work?
1. Reviewer receives an invitation from a journal to review a paper.
2. Invited reviewer fills in the form at http://academickarma.org/reviewagreement, specifying their conditions for agreeing to review the paper.
3. Academic Karma sends an email to the paper author informing them of the conditions requested by the anonymous invited reviewer.
4. The author can either agree, decline, or modify the agreement.
5. If the agreement is modified, an email is sent to the invited reviewer, who can either agree, decline, or modify the agreement.
6. Steps 3-5 are repeated until an agreement is reached, or until either the invited reviewer or the author declines.
7. Once an agreement is reached, the reviewer agrees with the journal to review the paper.
8. The author posts the preprint.
9. The reviewer posts their review as a comment on the bioRxiv preprint page, or, if they want to remain anonymous, they post the review on the Academic Karma review page and we post the comment for them (possibly after a specified ’embargo’ period to give the authors a chance to respond first).
10. The review is sent to the journal editor.
11. The author modifies the paper, re-uploads it to a preprint server, and posts their response to the review.
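The negotiation loop in steps 3-6 can be sketched as a simple exchange of proposals. The following Python is purely illustrative, assuming a hypothetical `negotiate` helper and dict-based proposals; it is a sketch of the back-and-forth, not Academic Karma’s actual implementation.

```python
# Illustrative sketch of the negotiation loop (steps 3-6 above).
# The function and field names are hypothetical, not Academic Karma's code.

def negotiate(proposal, responses):
    """Alternate counter-proposals between author and reviewer.

    Each response is either the string "agree", the string "decline",
    or a modified proposal (a dict), which becomes the new offer.
    Returns the final status and the proposal left on the table.
    """
    current = proposal
    for response in responses:
        if response == "agree":
            return "agreed", current
        if response == "decline":
            return "declined", current
        current = response  # modified agreement: negotiation continues
    return "pending", current  # no reply yet

# Example: the author asks to drop the open-data condition; reviewer accepts.
offer = {"post_preprint": True, "open_data": True, "open_code": True}
counter = {"post_preprint": True, "open_data": False, "open_code": True}
status, final = negotiate(offer, [counter, "agree"])
```

Either party ending the loop with "decline" leaves no agreement; any modified dict simply becomes the new offer, which mirrors how the email round-trips in steps 3-5 work.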
What is currently included in the review agreement:
1. Option to ask author to post preprint.
2. Option to ask author to agree to post a revised preprint.
3. Option to ask author to make data openly available.
4. Option to ask author to make source code openly available.
So that the expectations are not just on the author’s side, the reviewer can also choose to commit to:
1. The maximum time they will take for the review.
2. Whether they will agree to review a revised manuscript.
3. Whether they would be willing for their review to be transferred to another journal.
4. The length of the embargo period before the content of their review is posted.
These ‘contracts’ are not binding in any legal sense of course. However, a permanent record is kept so that both reviewer and author can refer to what they agreed to.
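As a sketch, the agreed terms could be captured in a small record like the following. The field names are our own guesses at the options listed above, not Academic Karma’s actual schema.

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical record of an agreed contract; field names are illustrative
# guesses at the options listed above, not Academic Karma's actual schema.

@dataclass(frozen=True)  # frozen: once agreed, the record should not change
class ReviewAgreement:
    # conditions requested of the author
    post_preprint: bool
    post_revised_preprint: bool
    open_data: bool
    open_code: bool
    # commitments offered by the reviewer
    max_review_days: int        # maximum time to complete the review
    review_revision: bool       # willing to review a revised manuscript
    transferable_review: bool   # review may be transferred to another journal
    embargo_days: int           # delay before the review content is posted

    def to_record(self) -> str:
        """Serialise the agreement so both parties can refer back to it."""
        return json.dumps(asdict(self), sort_keys=True)

agreement = ReviewAgreement(
    post_preprint=True, post_revised_preprint=True,
    open_data=True, open_code=True,
    max_review_days=21, review_revision=True,
    transferable_review=True, embargo_days=14,
)
record = agreement.to_record()
```

Making the record immutable and serialisable captures the spirit of a permanent record both parties can refer back to, even though the contract is not legally binding.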
We welcome any feedback on what extra conditions we might want to include, as well as what extra optional commitments reviewers might like to make.
Peer review is controlled by publishers who generate billions in profits from free labour. What are your options if you do not want to provide free labour to be exploited by publishers, but still want to contribute to peer review?
1. Demand payment for peer review, either for yourself, for your department, or perhaps as a contribution to a worthy cause.
2. Refuse to review for certain publishers.
3. Openly review preprints.
The problem with option 1 is that publishers will pass that cost on to authors. A good example of option 2 is The Cost of Knowledge boycott of Elsevier.
What about option 3?
The idea is that you only agree to review manuscripts if they are also posted as preprints, and that you publish your reviews openly online and send the link to the journal editor. The authors can respond to your reviews openly, independently of the review process in any particular journal. The journal can still use your reviews of course, but then again so can anyone else. Instead of doing free work for a publisher, you have contributed a common good from which everyone can benefit.
There is already a group of pioneers in this space posting open preprint peer review. These scientists have between them written 61 open reviews of 51 preprints. Undoubtedly there are many more examples (e.g. in blog posts) which we have not collected here – please point us to these so that we can index them.
Not everyone is comfortable posting non-anonymous open peer review. We have created a platform where you can post content-open preprint peer review anonymously or non-anonymously. You can review any arXiv, bioRxiv, PeerJ or SSRN preprint, or even papers deposited in several institutional repositories. If you are worried your review is overly critical and might be damaging to the authors, you can also set an ’embargo’ period to give the authors a chance to respond before the review is made open.
So here’s to those scientists who are showing us that there is a way to contribute to peer review without providing free labour to be exploited by publishers.
One of the goals of Academic Karma is to separate peer-review from publication in a particular journal. One of the most promising ways of doing this is by the community shifting to content-open review of preprints.
In order to try to encourage this, we are putting up a $200USD prize for the most endorsed review of a preprint presented at SMBE16.
The competition will run until the end of July.
Recently we tweeted the reasons to peer-review, but not ‘for’ journals.
It’s important to clarify what we mean by ‘for’ here: we use it to mean the primary purpose of the review. So by all means share a content-open preprint review with a journal considering publishing the article. Also, by content-open we mean making the content of the review open, while potentially choosing to remain anonymous. Junior researchers justifiably worry about the implications of non-anonymously criticising senior researchers in their field.
One way to peer-review, but not primarily ‘for’ journals is to review a preprint which was presented at a conference you attend. The advantages are that you get the chance to hear the author explain the work in person, and you can also ask them to clarify anything which was unclear in the preprint.
In order to make this easier, we have implemented a feature which lists preprints presented at selected conferences. If you would like this feature enabled for an upcoming conference, you can register it here http://academickarma.org/conferences, and let us know so we can help you curate the list of preprints.
One great example of this is the Society for Molecular Biology and Evolution conference (SMBE16): http://academickarma.org/conference/SMBE16.
In order to encourage more ‘conference preprint’ peer review, we are going to award $200USD to the most endorsed review of a preprint presented at SMBE16. The rest of this post is a primer on how to use Academic Karma to review an SMBE16 preprint. To review a preprint you don’t need to register specifically for Academic Karma, but you do need an ORCID identifier (http://orcid.org).
Step 1: Choose a preprint to review here: http://academickarma.org/conference/SMBE16.
Step 2: Click on the ‘Review’ link, which takes you to the review page for that particular paper. For illustrative purposes, we are going to review: A profile-based method for identifying functional divergence of orthologous genes in bacterial genomes.
Step 3: On the review page, click on the ‘Review this paper’ link.
Step 4: This takes you to a login-page for logging into Academic Karma via your ORCID. Once you have logged in you will be automatically redirected to the review page.
Step 5: Write and submit the review. You can specify who contributed to the review, whether you would like the comments sent via email directly to the author, whether you would like the review cc’d (e.g. to an editor), whether you would like to sign the review, and an ’embargo period’ before the comments become publicly visible (allowing the authors to respond to the review before it goes live).
Step 6: That’s it! Maybe tweet the review and ask people if they would like to endorse it.
To endorse a review, just click on the endorse button (you need to be logged in, but it’s easy to log in using your ORCID or Twitter credentials).
All completed reviews are listed alongside the preprints here: http://academickarma.org/conference/SMBE16.
One of the major goals of Academic Karma is to make open-access publishing both cheaper and faster.
One of the reasons open-access publishing remains expensive is the cost of coordinating peer review, which is estimated to make up at least half the cost of publishing in a pure open-access journal. Peer review is also one of the main reasons it takes so long between submitting and publishing a manuscript (https://quantixed.wordpress.com/2015/03/16/waiting-to-happen-ii-publication-lag-times/). Peer review is slow because it’s difficult to find reviewers (we are all too busy working on our own manuscripts, which makes sense, as that is what we are rewarded for) and it’s difficult to incentivise reviewers to spend time promptly on the review rather than on their own work.
Scientific Reports, a Nature Publishing Group open-access journal, also recognised that slow peer review is a problem, but proposed to solve it by providing fast-track peer review through a commercial partner for a fee of $750, of which $100 goes to each reviewer (presumably either two or three reviewers) and the remainder is taken as an extra processing charge by the journal and its commercial partner.
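For concreteness, the split of that $750 fee works out as follows. The reviewer count is an assumption, since the scheme only implies either two or three reviewers:

```python
fee = 750           # fast-track review fee, USD
per_reviewer = 100  # amount paid to each reviewer

# Remainder retained by the journal and its commercial partner,
# under either plausible reviewer count.
for reviewers in (2, 3):
    retained = fee - reviewers * per_reviewer
    print(f"{reviewers} reviewers: ${retained} retained")
# prints "2 reviewers: $550 retained" and "3 reviewers: $450 retained"
```

In other words, 60-73% of the fast-track fee goes to the journal and its partner rather than to the reviewers doing the work.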
In contrast, the whole point of Academic Karma is that diligent, timely reviewers should be able to access fast-track peer review of their own papers. So in a sense we also want to introduce a two-speed review process, except that ours rewards academics who are contributing to speeding up peer review. The logic is that this creates an incentive for every scientist to be a regular and diligent peer reviewer, which speeds up the system for everybody. In the long run this creates a self-sustaining system which makes it much easier to find academics eager to do ‘enough’ peer review so that their own papers will be reviewed in a reasonable time-frame.
Of course, timeliness is not the only factor in peer review. The quality of the review is also very important, and for that reason we are working on ways in which reviewers, editors and authors can evaluate the quality of the review. Ultimately we are trying to build a system which speeds up peer review, improves its quality and lowers its cost.