Experimental Results: the challenge of implementing our first open peer review process

Experimental Results is a truly innovative project for Cambridge University Press – it is the first of our journals to have an open peer review process, and it is designed to address the positive-results bias and replication issues often seen in traditional journals by providing a forum for all sound experimental findings across Science, Technology & Medicine. Reviews (in a scorecard format) for each article are published online within the article PDFs and are also presented as charts in a separate peer review tab. In keeping with the transparent nature of Experimental Results, we would like to be open about the challenges we encountered in developing the display on the Cambridge Core website to make this possible.

The process began more than eighteen months ago with the preparation of the concept document and business case for Experimental Results, which laid out the innovative features and development work that would be key to the project. For open peer review, there were a number of development points to be taken into account, such as how we wanted the reviews to appear online and how to enable this, as well as ensuring that the reviews were published in an ethical manner, including clear reviewer consent and conflict of interest policies. The process involved members from multiple teams within the Press, with various working groups forming to focus on the different aspects.

As we progressed, we needed to continually adjust and add to our plan as we came across points that we hadn’t previously considered, or things that didn’t work as we had initially anticipated. Our original vision of a very streamlined and efficient peer review process – a single round of scorecard-based evaluation with minimal commenting – had to be modified to accommodate the practices of reviewers, who wanted to add specific, granular detail to their reports, including recommendations for improvement even for papers that were acceptable for publication, which we hadn’t originally allowed for. This necessitated an eleventh-hour elaboration of the planned peer review workflows. Experimental Results has undoubtedly benefitted from the more flexible model that resulted from these changes.

We needed to make sure that our online peer review system would work with our journals production system, pulling across the right metadata to display on the front end. The first step was to work out how to capture the peer review materials. Our internal peer review team went above and beyond to set up the review scorecards and the consent and conflict of interest statements for reviewers within ScholarOne, and to make sure that the data was exported in a form that our typesetters could use.
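To give a flavour of the kind of transformation involved, here is a minimal sketch of how an exported scorecard might be normalised into structured records for the typesetters. It is purely illustrative: the CSV layout, column names and `ReviewRecord` fields are all invented assumptions, not the actual ScholarOne export format or the Press’s pipeline.

```python
# Hypothetical sketch only: the CSV layout, column names and consent flag
# are illustrative assumptions, not the real ScholarOne export format.
import csv
from dataclasses import dataclass


@dataclass
class ReviewRecord:
    manuscript_id: str
    reviewer_statement: str
    scores: dict            # scorecard criterion name -> score
    consent_to_publish: bool


def load_reviews(path: str) -> list[ReviewRecord]:
    records = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Treat any column prefixed "score_" as a scorecard criterion.
            scores = {k.removeprefix("score_"): int(v)
                      for k, v in row.items() if k.startswith("score_")}
            records.append(ReviewRecord(
                manuscript_id=row["manuscript_id"],
                reviewer_statement=row["statement"].strip(),
                scores=scores,
                consent_to_publish=row["consent"].lower() == "yes",
            ))
    # Only reviews with explicit reviewer consent go forward to publication.
    return [r for r in records if r.consent_to_publish]
```

The key design point, whatever the real field names, is that reviewer consent is captured as structured data alongside the scores, so the decision to publish a review can be enforced mechanically downstream.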

At an early stage we involved our external design partner, Make it Clear, in the development of an innovative presentation of the scorecard with graphic elements. The allocation of a DOI to each peer review report was an essential element of our concept. We realised along the way that this meant publishing the reviews as sub-articles within the main article PDFs, as well as in the peer review tab, which involved unforeseen work with the typesetters. We had never captured content as sub-articles before, or created DOIs for items associated with a parent article, so establishing how this could be implemented involved extensive collaboration between our peer review, content services and production teams.
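For readers curious about the mechanics: in standard JATS XML, content of this kind is carried as `<sub-article>` elements within the parent article, each with its own `<article-id pub-id-type="doi">` in a `<front-stub>`. A minimal sketch of reading the parent and review DOIs out of such a file follows; the file name and the assumption that these particular files use exactly this structure are ours, though the elements themselves are standard JATS.

```python
# Illustrative sketch: list the parent article DOI and the DOIs of its
# sub-articles (here, peer review reports) from a JATS XML file.
# The file name is hypothetical; <sub-article>, <front-stub> and
# <article-id pub-id-type="doi"> are standard JATS elements.
import xml.etree.ElementTree as ET

tree = ET.parse("example-article.xml")
root = tree.getroot()

parent_doi = root.findtext(
    "front/article-meta/article-id[@pub-id-type='doi']")
print("article:", parent_doi)

for sub in root.findall("sub-article"):
    review_doi = sub.findtext(
        "front-stub/article-id[@pub-id-type='doi']")
    print("  review:", review_doi)
```

Because each review is a child of the article in the XML itself, the parent–child DOI relationship travels with the content wherever the file goes, rather than living only in a separate database.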

On 5th March, we published the first papers with their accompanying peer reviews. Initially there were some obvious display issues that needed resolving urgently: for example, all of the reviews unexpectedly appeared in the content listing as separate articles, with no clear distinction between articles and reviews. This required an emergency piece of technology work, which soon resolved the issue.

There continue to be points that need enhancing, and we are taking on board user feedback to guide the next stage of development. We worked in an agile way – pushing to get a usable product out as soon as we could, knowing that there would still be improvements to make later. We are really pleased to have got this far and feel that we made a lot of correct decisions, but it has also been a steep learning curve, and our teams within the Press have needed to work flexibly along the way to adapt to and handle the various obstacles that came up.

It has been great to work as a cross-functional team, creating innovative solutions to complex requirements. We can now extend this functionality to other journals, and we will continue to adapt as part of our transformation to open research.

Paul Hague, Product Owner, Cambridge Core Team

“For Experimental Results we wanted a transparent display of review information that reflected the nature of the journal. For brevity we included a single statement from each reviewer, alongside a scorecard in which reviewers were asked to evaluate the submission in a number of areas. The challenge was to display this new information while keeping the Cambridge Core aesthetic – balancing detailed information with a clean, clear style that would display across the range of devices our users access Cambridge Core from. We took an agile approach, allowing the development to be changed and added to as we explored the display and functionality. We continue to improve the display and user experience following user feedback.”

Emma Pearce, Senior Content Manager, Journals Production

“It has been a great experience working on Experimental Results as decisions about the production process have required collaboration with many colleagues in other departments and have led to a really collaborative and exciting new journal.”
