Full steam ahead with our incredible Open Science Stars! We hope you’ve been enjoying the series so far, and today we’re bringing you Dasapta Erwin Irawan, a researcher based in Indonesia working at the interface between Engineering, Hydrogeology and Geoscience, and an avid open science supporter. Enjoy his story!
When did you first hear about ‘open science’? What was your first reaction, do you remember? It’s kind of funny, I heard it first from you :). (Ed: *sniff*) It was one of your blog posts from 2012, ‘Relocation, and a chance to try some open science-ing’, that gave me the idea of sharing my results as fast and as widely as I can. I had just finished my PhD when I first read it, along with your posts on the EGU blog. There I noticed your hashtag ‘#OpenPhD’ and followed it. I wasn’t serious about using my Twitter handle for academic purposes back then. My first reaction was to make all my published papers available online, posting them all on my ResearchGate account and my blog.
You have a very strong commitment to open science. What is it that drives this for you?
My strong commitment has been built by seeing so many others doing the same thing. In Indonesia, where not many universities have subscriptions to major journals, open science could be the answer to what we’ve been looking for. Everybody here keeps saying to submit papers to major paywalled journals, as they have good reputations and are indexed by WoS or Scopus, but it shouldn’t be that way. What we need in Indonesia is to keep writing, write more in English, and find ways to make our work easier to find and access, as if it were indexed by WoS and Scopus. And I see that by using the latest free and open source services, we can do that.
In Indonesia, where not many universities have subscriptions to major journals, open science could be the answer to what we’ve been looking for
There are many amazing blogs and bloggers out there that provide critical comments, context, and feedback on the ‘formally published’ research literature. One problem, though, is that these are often divorced from the papers themselves, perhaps lost on obscure websites or not reaching the right target audience. This seems like an awful waste, don’t you think?
While some great initiatives such as The Winnower will now publish blog posts openly, these are still not connected to the papers they are based on, if they are indeed written about particular papers. But what do researchers think about blogging as a form of scholarly communication, in the form of post-publication peer review?
So, as with most of my ponderings, I took to Twitter to get some feedback with a little poll. I actually framed the question a little ambiguously, but this shouldn’t have skewed the data significantly in any direction (I hope).
Do you consider blogging to be a form of post-publication peer review?
What is interesting to me is that 41% of respondents, who undoubtedly were not exclusively researchers, do not consider blogging to ‘count’ as peer review. I would really love to know why this is the case for some people. Perhaps they haven’t seen good examples, or perhaps it’s just because blogging isn’t formalised in any way and is quite dissociated from the research literature.
ORCID integration has been at the heart of our publishing system since our inception. We like to think that this demonstrates that ScienceOpen was already thinking way ahead of the curve for the future of publishing, and recognising the importance of infrastructure and the value of unique identifiers. ORCID is now a major part of the scholarly communications infrastructure, and becoming more so with each passing day.
At ScienceOpen, registration with us requires registration with ORCID. In fact, if you register with us, we will automatically offer you the option of registering with ORCID.
To comment on, review, and rate articles, we require an ORCID iD along with membership at ScienceOpen. If you have more than 5 articles within your ORCID profile, you’ll gain Expert member status with us, and free rein over our services! We feel this is important to maintain a high standard of quality for our peer review services. This isn’t to say that those without an ORCID iD wouldn’t be great referees; it’s just that this is an explicit minimum standard.
Here’s a little table to help make this a little easier to understand. We’re evolving all the time to adapt to the needs of the research community, so please let us know if there’s anything we can do to enhance our services!
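In code terms, the rule described above boils down to a simple threshold check. Here is a minimal sketch (the function name is ours, not ScienceOpen’s actual code), following the “more than 5 articles” wording:

```python
def is_expert_member(orcid_article_count: int) -> bool:
    # Per the rule described above: more than 5 articles on your
    # ORCID profile grants Expert member status (full review privileges).
    # This is an illustrative sketch, not ScienceOpen's implementation.
    return orcid_article_count > 5

print(is_expert_member(6))  # True
print(is_expert_member(3))  # False
```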
The aim of this partnership is to standardise and integrate information that is currently distributed across more than 230 systems and databases in Germany. Adopting ORCID will support German universities and research institutes in implementing it in a co-ordinated and sustainable way.
“Thanks to the financial support from the Deutsche Forschungsgemeinschaft we have now the opportunity to promote the use of ORCID in Germany. This is a strong signal for ORCID in Germany,” says Roland Bertelmann, head of the Library and Information Services at the German Research Centre for Geoscience (GFZ).
ORCID is a critical part of research infrastructure, acting as a unique identifier for researchers and a sort of LinkedIn-style profile with your published research and your educational and professional histories embedded, partnered with services such as CrossRef and Scopus to make content integration easy and automated.
One main aspect of open peer review is that referee reports are made publicly available after the peer review process. The theory underlying this is that peer review becomes a supportive and collaborative process, viewed more as an ongoing dialogue between groups of scientists to progressively assess the quality of research. Furthermore, it opens up the reviews themselves to analysis and inspection, which adds an additional layer of quality control to the review process.
This co-operative and interactive mode of peer review, whereby it is treated as a conversation rather than a selection system, has been shown to be highly beneficial to researchers and authors. A study in 2011 found that when an open review system was implemented, it led to increased co-operation between referees and authors, as well as greater accuracy of reviews and an overall decrease in errors throughout the review process. Ultimately, it is this process which decides whether research is suitable or ready for publication. A recent study has even shown that the transparency of the peer review process can be used to predict the quality of published research. As far as we are aware, there are almost no drawbacks, documented or otherwise, to making referee reports openly available. What we gain by publishing reviews is the time, effort, knowledge exchange, and context of an enormous amount of currently secretive and largely wasted dialogue, which could also save around 15 million hours per year of otherwise lost work by researchers.
ORCID (Open Researcher and Contributor ID) is a community-based effort to provide a registry of unique and persistent researcher identifiers, and through this links to research activities and outputs. It is a powerful tool for both researchers and institutions, and can be easily integrated with CrossRef, PubMed Central, Scopus, and other data archives to populate researcher records.
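Under the hood, an ORCID iD is a 16-character identifier whose final character is a checksum computed with the ISO 7064 MOD 11-2 algorithm, which catches most transcription errors. A minimal validation sketch (function names are our own, not from any official ORCID library):

```python
def orcid_checksum(base_digits: str) -> str:
    """ISO 7064 MOD 11-2 check character for the first 15 digits."""
    total = 0
    for ch in base_digits:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a formatted iD such as '0000-0002-1825-0097'."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_checksum(digits[:15]) == digits[15]

# 0000-0002-1825-0097 is ORCID's own well-known example iD
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```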
Most of us, whether we are researchers or not, can intuitively grasp what “profile fatigue” is. For those who are thus afflicted, we don’t recommend the pictured Bromo Soda, even though it’s marketed for brain fatigue. This is largely because it contained bromide, which is chronically toxic; medications containing it were withdrawn in the USA from 1975 (wow, fairly recently!).
Naturally, in the digital age, it’s important for researchers to have profiles and be associated with their work. Funding, citations and lots of other good career advancing benefits flow from this. And, it can be beneficial to showcase a broad range of output, so blogs, slide presentations, peer-reviewed publications, conference posters etc. are all fair game. It’s also best that a researcher’s work belongs uniquely to them, so profile systems need to solve for name disambiguation (no small undertaking!).
This is all well and good until you consider the number of profiles a researcher might have created at different sites already. To help us consider this, we put together this list.
- ORCID — Non-profit: independent, community-driven
- ResearcherID — Publisher: Thomson Reuters
- Scopus Author ID — Publisher: Elsevier
- Academia.edu — Researcher network
- ResearchGate — Researcher network
The list shows that a researcher could have created (or been assigned, in the case of Scopus) 7 “profiles”, or more accurately, 7 online records of research contributions. That’s on top of those at their research institution and other organizations, and only one iD (helpfully shown in green at the top!) is run by an independent non-profit, called ORCID.
Unlike a profile, an ORCID iD is a unique, persistent personal identifier that a researcher uses as they publish, submit grants, and upload datasets, and that connects them to information on other systems. But not to all other profile systems (sigh). Which leads us, once again, to the concept of “interoperability”, one of the central arguments behind recent community dissatisfaction over the new STM licenses, which we have covered previously.
Put simply, if we all go off and do our own thing with licensing and profiling then we create more confusion and effort for researchers. Best to let organizations like Creative Commons and ORCID take care of making sure that everyone can play nicely in the sandbox (although they do appreciate community advocacy on these issues).
Interoperability is one good reason why ScienceOpen integrated our registration with ORCID and uses their iDs to provide researcher profiles on our site. We don’t do this because we think profiles are kinda neat; they are, but they are also time-consuming and tedious to prepare (especially 6 times!).
We did it because we are trying to improve peer review, which we believe should be done after publication by experts with at least 5 publications on their ORCID iD, and because we believe in minimizing researcher hassle. This is why our registration process is integrated with the creation of an ORCID iD, which could become pivotal for funders in the reasonably near future (so best for researchers to get on board with them now!).
So given that it seems likely that all researchers will need an ORCID iD (and boy, it would be nice if they would get one by registering with us!), it is also important that all the sites listed above integrate with ORCID too, and that hasn’t happened yet (you know who you are!). The others have done a nice job of integrating, by all accounts.
In conclusion, publishers and other service providers need to remember that they serve the scientific community, not the other way around, and this publisher would like to suggest that everyone listed above please integrate with ORCID pronto!
At ScienceOpen, the research + publishing network, we’re enjoying some of the upsides of being the new kid on the Open Access (OA) block. Innovation, and building on the experiments of others, is easier when there’s less to lose, but we are also the first to admit that life as a start-up is not for the faint-hearted!
In the years since user-generated comments and reviews were first introduced, those of us who strive to improve research communication have wrestled with questions such as: the potential for career damage; content aimed at peer versus public audiences; whether to accept comments from experts, from everyone, or a mix; and lower-than-anticipated participation.
We want to acknowledge the many organizations who have done a tremendous job at showing different paths forward in this challenging space. Now it’s our turn to try.
Since launch, ScienceOpen has assigned members different user privileges based on their previous publishing history, as verified by their ORCID iD. This seemed like a reasonable way to measure involvement in the field and to ensure the right level of publishing experience to understand the pitfalls of the process. This neat diagram encapsulates how it works.
Scientific and Expert Members of ScienceOpen can review all the content on the site, which includes 1.3 million+ OA articles and a very small number of our own articles (did we mention we’re new!).
All reviews require a four-point assessment (using five stars) of importance, validity, completeness, and comprehensibility, and there’s space to introduce and summarize the material. Inline annotation captures reviewer feedback during reading. Next up in the site release cycle: mechanisms to make it easy for authors to respond to inline observations.
In a move sure to please busy researchers tired of participating without recognition, each review, including the subsequent dialogue, receives a Digital Object Identifier (DOI) so that others can find and cite the analysis, and the contribution becomes a registered part of the scientific debate.
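Conceptually, such a review can be thought of as a small structured record. The following is a purely hypothetical sketch (field and class names are ours, not ScienceOpen’s data model) of the structure described above: four star ratings, a summary, and the review’s own DOI:

```python
from dataclasses import dataclass, field

@dataclass
class Review:
    # Four-point assessment, each rated on five stars (1-5), as
    # described in the post; all names here are hypothetical.
    importance: int
    validity: int
    completeness: int
    comprehensibility: int
    summary: str = ""
    doi: str = ""  # each published review receives its own citable DOI
    annotations: list = field(default_factory=list)  # inline reviewer notes

    def overall(self) -> float:
        """Simple average of the four ratings (illustrative only)."""
        return (self.importance + self.validity
                + self.completeness + self.comprehensibility) / 4

r = Review(importance=4, validity=5, completeness=3, comprehensibility=4)
print(r.overall())  # 4.0
```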
Welcome to our wonderful world of Reviewing! Please share your feedback here or @Science_Open.