For many publishers, the requirements of modern digital publishing can be dizzying – XML DTDs, PIDs, DOIs, metatags. At ScienceOpen we have been advising publishers on their metadata for years to help them get the most visibility possible for their academic publications. Together with our technical partner, Ovitas, we have steadily built systems to support publishers with metadata creation and distribution, making each new tool available to the next customer. As a technical hub for metadata, we can automate time-consuming tasks and let publishers concentrate on the content. Here are a few of the services we provide to help take the pain out of publishing:
Authors and publishers have long shared a common goal: more readership for their publications. Traditionally, the abstract was a teaser to encourage the potential reader to buy or subscribe to read the full text. Even in an open access economy, a good abstract can trigger a coveted “download” and an even more coveted citation. Why, then, do many publishers not make their abstracts and other metadata, such as references or license information, freely accessible in a machine-readable format?
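In practice, “machine-readable” often simply means depositing these fields with Crossref, where anyone can retrieve them through the public REST API. As a minimal sketch, here is one way to see which of these fields a publisher has actually deposited for a given DOI; it uses Python with the third-party requests library, and the DOI shown is Crossref's own test record, standing in for any real article:

```python
# Minimal sketch: see which metadata fields are publicly retrievable for a DOI
# via the Crossref REST API. Assumes the third-party "requests" library
# (pip install requests); the DOI below is Crossref's test record and should
# be replaced with a real article of interest.
import requests

def deposited_metadata(doi):
    """Summarize what a publisher has deposited with Crossref for this DOI."""
    record = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10).json()["message"]
    return {
        "title": record.get("title", ["(no title)"])[0],
        "has_abstract": "abstract" in record,       # abstract deposited?
        "open_references": "reference" in record,   # reference list exposed?
        "license_urls": [lic.get("URL") for lic in record.get("license", [])],
    }

print(deposited_metadata("10.5555/12345678"))
```

Fields that are missing from this record are effectively invisible to indexing and discovery services, however good the abstract on the journal’s own website may be.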
Launching a new open access journal or an open access press? ScienceOpen now provides full end-to-end open access publishing solutions – embedded within our smart interactive discovery environment. A modular approach allows open access publishers to pick and choose among a range of services and design the platform that fits their goals and budget.
You want to create a unique publishing identity? Book your own sub-domain powered by ScienceOpen to manage and host existing open access publications or start new journals. ScienceOpen can provide technical infrastructure for manuscript submission, peer review management, open access hosting, article versioning, distribution, analytics and APC management for journals and (coming soon) books. The ScienceOpen platform has its own powerful citation index and is uniquely integrated with ORCID, Crossref and Altmetric to immediately plug your publications into the infrastructure of global scholarly communication.
Researchers often pay substantial sums to make the results of their research freely accessible to all. But how do you let potential readers know that it’s FREE? If no one reads your open access paper, it’s like buying someone a gift certificate that they never use. So the community has agreed on this solution:
The open access symbol signals to readers that they can expect direct and unrestricted access to published scholarly works. Originally created by PLOS, it quickly gained broad usage on publisher webpages and other sites to identify open access articles. ScienceOpen displays this open access symbol on over 4 million articles.
So how does the open access symbol get there? When a publisher publishes an article, they deposit the article “metadata” – title, authors, abstract, journal, date, URL and so on – with Crossref, the central DOI registration agency. Part of the information they can deposit is a machine-readable Creative Commons open access license. When ScienceOpen imports the metadata for your publication, it gets an open access symbol if our computers find an open access license associated with it. If a publisher does not deposit license information, we assume that the article is not open access. It’s that simple.
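The check itself is conceptually simple. As a rough sketch of the idea (not ScienceOpen’s actual implementation), the snippet below asks the public Crossref REST API whether a DOI’s record carries a Creative Commons license URL; it uses Python with the third-party requests library, and the DOI shown is only a placeholder:

```python
# Rough sketch of the license check described above (not the actual
# ScienceOpen implementation): an article only earns the open access symbol
# if its Crossref record carries a machine-readable Creative Commons license.
# Assumes the third-party "requests" library; the DOI is a placeholder.
import requests

def has_machine_readable_oa_license(doi):
    """True if the Crossref record for this DOI lists a Creative Commons license."""
    message = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10).json()["message"]
    # Crossref returns deposited licenses as a list of objects with a "URL" field.
    return any("creativecommons.org" in (lic.get("URL") or "")
               for lic in message.get("license", []))

print(has_machine_readable_oa_license("10.5555/12345678"))  # placeholder DOI
```

If this returns False, the record either has no deposited license or a non-Creative-Commons one, which is exactly the case in which no open access symbol appears.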
ScienceOpen understands the importance of allowing researchers to share their results openly with the scientific community at an early stage. The advantage is that researchers get early feedback from peers and can still publish the final version in most peer-reviewed journals of their choosing.
Portrait of Albert Einstein in a museum. Source: pixabay.com
Peer Review Week, Sep 10-15, 2018
Peer Review Week is a global event celebrating the role of peer review in maintaining scientific quality. This year marks the fourth year that the event has brought together researchers, institutions, and organizations committed to the message that good peer review is crucial to scholarly communication. Under this year’s theme of diversity, Peer Review Week aims:
To emphasize the central role peer review plays in scholarly communication
Although peer review itself is not as young as the week-long event organized in its celebration, it is still a relatively recent invention. Albert Einstein published his original papers in non-peer-reviewed German journals through 1933, most famously in the Annalen der Physik. Max Planck, one of the journal’s editors at the time, described his editorial philosophy as:
To shun much more the reproach of having suppressed strange opinions than that of having been too gentle in evaluating them.
After moving to the US, Einstein was so shocked that a paper he submitted to the Physical Review in 1936 was met with negative criticism that he decided never to publish with the journal again. Ironically, the paper in question hypothesized that gravitational waves do not exist. In retrospect, peer review saved Einstein the controversy and embarrassment that would have followed had he published his original article.
How can something exclusive, secretive, and irreproducible be considered objective? How can something exclusive, secretive, and irreproducible be considered a ‘gold standard’ of any sort?
Traditional, closed peer review has all of these traits, yet for some reason it is held in esteem as the most rigorous and objective standard of research and knowledge generation that we have. Peer review fails its own test of integrity and validation, and that is one of the greatest ironies of the academic world.
What we need is a new standard of peer review suited to a Web-based world of scholarly communication: one that accommodates the increasingly rapid communication of research and new sources of information, and that brings peer review out of the (literally) dark ages and into an era of fast, open, digital knowledge dissemination.
What should a standard for peer review look like in 2017?
The big test for peer review, and for any future version of it, is how the scientific community applies its stamp of approval.