Like two arms attached to the same body (let’s pretend it’s a publishing yeti), your journal peer review and production teams have a shared objective to prepare and disseminate new articles as expeditiously as possible.
But if the people and systems that make up those two arms aren’t highly aligned, they can begin to falter, leading to delays.
Those with dispersed internal peer review and publishing teams or outsourced production vendors likely know the challenges firsthand. Think hunting down missing article information, cleaning up dirty metadata, and other unwelcome tasks.
This blog post covers tips from the Scholastica Production Service team to help you keep your peer review and production arms in sync, reducing coordination costs for everyone involved.
Ready to make your publishing yeti as efficient as possible? Let’s get to it!
Set your editors up for successful production handoffs
Work tends to go smoothly when both arms of the publishing yeti are busy with their own processes. It’s when they have to start handing files and data between different people and systems on either side that challenges usually arise.
That’s why the first step to keeping your editorial and production teams in sync is setting your editors up to make smooth handoffs of accepted submissions to production. What steps should you take? Below are recommended best practices:
- Help your authors help you: First things first, one of the most common causes of production delays is articles arriving without required accompanying files or metadata. The best way to avoid such situations is to set your authors up for success by requiring ALL necessary file upload and metadata fields in your submission form (if you’re unsure how to do this, ask your vendor!). From there, regularly review your submission form to spot additional optimization opportunities. For example, aim to continually clarify and tighten directive language and adopt the latest Persistent Identifiers (PIDs), like ROR IDs, for research organizations. Journals using the Scholastica Peer Review System can learn more about setting up such submission form customizations and all of the PIDs we support in this blog post.
- Streamline information transfer between tools and systems: Another common cause of production holdups is editors having to manually enter manuscript details into production systems. Such processes are tedious at best and prone to error at worst. If this is the case for your team, now’s the time to explore opportunities to integrate your peer review and production systems so you can transfer manuscript files and metadata between them automatically. The best solution will look different for different publishers and depend on whether they want to outsource production or keep it in-house. Start by scheduling a meeting with your internal team to discuss manual processes, then reach out to current or potential vendors to learn how they can help.
- Follow a rolling production schedule: Finally, consider implementing a rolling production schedule if you haven’t already. By that, we mean sending manuscripts to production as soon as editors accept them so your production team can start working on them stat (even if you wait to publish articles in issues!). Doing so will spread out work for your production team into more manageable batches. By keeping the momentum going for accepted submissions, rather than tabling those manuscripts until all the others are ready, you’ll also reduce coordination costs for your editors (e.g., re-contacting authors to remind them of production next steps and initiate that process) and improve your author experience. (If you’re unsure whether your production team can support a rolling workflow, ask!)
Get ahead of ethical dilemmas
Research integrity checks are paramount, including screening for plagiarism, adherence to journal ethical policies, and confirmation of authorship. And, as with manuscript files and metadata, all of these checks should be completed and verified upfront. You don’t want the production arm of your publishing yeti to have to scramble to get things like signed author agreements when you could have gotten them sooner.
Below are some steps you can take:
- Plagiarism detection: Work with your editors to establish and regularly iterate on plagiarism definitions and processes for screening submissions. Aim to catch and address concerns of possible plagiarism before sending manuscripts out for external peer review or making decisions. Using plagiarism detection software like Similarity Check and following similarity report review best practices can make this a lot easier.
- Adherence to ethical policies: Add required affirmations to your submission form to have authors confirm they’ve read and adhered to journal ethical policies (e.g., statements of originality and disclosures, authorship guidelines, etc.).
- Confirmation of authorship: If you haven’t already, consider adopting CRediT (the Contributor Roles Taxonomy), the NISO standard. CRediT’s set of 14 research contributor roles (e.g., conceptualization and data curation) enables authors to indicate the specific ways they worked on a submission, eliminating ambiguity around types and levels of contribution, which can help prevent authorship disputes. You can learn more about how to get started with CRediT in this webinar.
Reduce workflow redundancies
These days, it’s possible to automate SO many aspects of the journal production process to improve the accuracy and speed of your publishing yeti. But, if your editorial team isn’t aware of the opportunities available, they may be baking redundant steps into your submission guidelines and technical checks. That means more work for editors and not-so-happy authors.
To avoid this situation, check in with your internal or external production team to ensure you’re up-to-date on all the production automations currently available. For example, journals using Scholastica’s single-source production service need not worry about authors or editors pre-formatting submissions. All they have to do is send us original DOCX or LaTeX manuscript files, and our software-based process will take care of the rest, generating full-text XML articles and PDFs in the desired layout, including applicable customizations (e.g., font, section colors, etc.). We can also convert citations to major styles as needed (e.g., APA to MLA), so journals don’t have to worry about citation formatting. As a result, journals can implement a “your manuscript your way” approach to submissions, making for happier authors and a smoother publishing workflow.
Not yet implementing a single-source production process and interested in learning more? Check out Scholastica’s free ALPSP session, “Going digital-first to streamline the journey from journal submission to publication.” You can read the session recap and get the recording here.
Keep author edits in check
Sometimes, last-minute manuscript edits are unavoidable. But, in general, all revisions should be complete before sending article files to production (including copyedits!). It’s imperative to set this expectation among your journal editors during onboarding (and in internal documentation as applicable) and among your authors in your manuscript acceptance and follow-up emails.
Similarly, it’s essential to set clear expectations for proof review and communicate them to your authors and production team, including the scope of acceptable edits and firm deadlines. Then, empower your production team to enforce those requirements. That way, they can make quick calls rather than running every requested deadline extension or out-of-scope proof change by your editors whenever such situations arise. If you work with an external vendor, start a conversation with them about ways they can help.
Collect metadata that meets the requirements of your article destinations
From content registration databases to archives to indexes, there are myriad destinations for published articles these days, and reaching all of them successfully depends on one thing: quality metadata. You can save everyone on your publishing team time by prioritizing collecting metadata that meets the requirements of all those destinations upfront. Key steps to take include:
- Requiring submission form fields for article-level metadata (remember: help your authors help you!)
- Adopting standard PIDs (e.g., Crossref DOIs, ORCID iDs, and ROR IDs)
- Verifying PID inputs automatically via your submission form (ask your vendor about available options if you’re not sure)
Why? Errors that arise after submitting articles to preservation and discovery services can take weeks to surface and then weeks to correct.
For example, you can prevent potential Crossref DOI registration errors by producing JATS XML article-level metadata that adheres to Crossref minimum/maximum character restrictions (e.g., for title, abstract, etc.). Talk to your production team about this!
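As a sketch of what such a pre-deposit check could look like, here’s a minimal Python validator that flags metadata fields falling outside expected length ranges before anything is sent off. The specific limits below are placeholders for illustration only — consult your deposit target’s schema documentation for the actual values:

```python
# Placeholder length limits (min_chars, max_chars) for illustration only.
# Replace these with the real limits from your deposit target's schema docs.
FIELD_LIMITS = {
    "title": (1, 1000),
    "abstract": (0, 10000),
}


def check_field_lengths(metadata: dict) -> list:
    """Return a list of human-readable problems; empty list means all clear."""
    problems = []
    for field, (lo, hi) in FIELD_LIMITS.items():
        value = metadata.get(field, "")
        if not (lo <= len(value) <= hi):
            problems.append(f"{field}: length {len(value)} outside [{lo}, {hi}]")
    return problems
```

Running a check like this at the point of handoff — rather than after deposit — surfaces problems in seconds instead of weeks.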
Additional steps that your peer review system should support include verifying the funder IDs authors input into your submission form using the Crossref Funder Registry, verifying and authenticating authors’ ORCID iDs, and associating authors’ institutions with ROR IDs.
One more tip from the Scholastica Production Team: Ensure your submission form fields correlate with non-PID metadata inputs required by archives and indexes. For example, PubMed Central requires XML files to include “article type” metadata that conforms with its standard options. If you use your own “article type” taxonomy or allow authors to input custom options, that’s sure to mean metadata cleanup work for your production team.
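One way to reduce that cleanup work is to maintain an explicit mapping from your journal’s taxonomy to the standard values your destinations expect. The journal-side labels below are hypothetical, but the target values are standard JATS `article-type` attribute options; a minimal sketch:

```python
# Hypothetical journal taxonomy mapped to standard JATS article-type values.
# Adjust the left-hand labels to match your own submission form options.
ARTICLE_TYPE_MAP = {
    "Original Research": "research-article",
    "Literature Review": "review-article",
    "Case Study": "case-report",
    "Letter": "letter",
    "Editor's Note": "editorial",
}


def normalize_article_type(journal_type: str) -> str:
    """Translate a journal-side article type label to its standard value."""
    try:
        return ARTICLE_TYPE_MAP[journal_type.strip()]
    except KeyError:
        raise ValueError(f"No standard mapping for article type: {journal_type!r}")
```

Raising an error on unmapped values (instead of passing custom labels through) forces the mismatch to surface before the XML ever reaches an archive.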
Treat errors like bugs and aim to fix them systemwide
In software development, we talk a lot about “fixing bugs.” Not the creepy crawly kind. In this context, bugs refer to faults in the design of a system that can cause unexpected or incorrect results. To successfully eliminate bugs, developers must get to the root of those issues.
Even in the most perfectly designed systems, bugs are sure to crop up occasionally because technology requirements are ever-evolving. And scholarly publishing systems are no exception, with countless current and emerging rules to follow, like new DTD and metadata standards.
That said, when your production team encounters technical issues, such as metadata deposits to archiving and indexing services returning errors like “ORCID iDs are invalid,” it’s essential to treat them like bugs and find/fix any faults in your current system. Otherwise, those same bugs will reoccur (and no one wants to spend hours squishing them individually!).
For example, let’s say you get an “ORCID iDs are invalid” error when registering a DOI with Crossref after typesetting. In addition to fixing the individual error and resubmitting the article-level metadata to Crossref, treat the error as a bug by finding the underlying cause within your current system (e.g., when an author mistypes their ORCID in your submission form). In this case, adding ORCID verification to your submission form can help you prevent invalid inputs. Going further, you could implement ORCID authentication to ensure you get the correct ORCID iD for each researcher (a feature we recently added to the Scholastica Peer Review System).
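To make the “fix the bug, not just the error” idea concrete: ORCID iDs carry a check digit computed with the ISO 7064 MOD 11-2 algorithm, so a submission form can reject most mistyped iDs instantly. A minimal format-and-checksum check, independent of any particular peer review system, might look like this:

```python
import re

# An ORCID iD is four groups of four characters; the final character is a
# check digit that may be "X" (representing the value 10).
ORCID_PATTERN = re.compile(r"^\d{4}-\d{4}-\d{4}-\d{3}[\dX]$")


def orcid_is_valid(orcid: str) -> bool:
    """Check an ORCID iD's format and its ISO 7064 MOD 11-2 check digit."""
    if not ORCID_PATTERN.match(orcid):
        return False
    digits = orcid.replace("-", "")
    total = 0
    for ch in digits[:-1]:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    expected = "X" if result == 10 else str(result)
    return expected == digits[-1]
```

Note that this only catches malformed iDs, not valid iDs belonging to the wrong person — that’s where ORCID authentication (having the author sign in via ORCID) comes in, as described above.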
Keep open lines of communication
Above all, it’s critical to cultivate a collaborative relationship with your production team — even if you outsource to a vendor. For example, at Scholastica, we have onboarding calls with all of our Production customers to explain our typesetting process and get to know their unique needs so we can help optimize peer review to production handoffs.
From there, keep open lines of communication. Share feedback with in-house production staff or external vendors about any papers that didn’t go smoothly (e.g., catch those bugs!). And always look for opportunities to keep iterating on your individual peer review and production processes. We cover tips to streamline peer review here and production here.
Follow these steps, and your publishing yeti will go from abominable to unstoppable!
Have a question or additional insights to share? Join the conversation by posting in the blog comments. You can find Scholastica on LinkedIn, X (formerly Twitter), and Facebook.