We’ve had some amazing new publications recently here at ScienceOpen, with many more in the pipeline too! For us, every paper we publish is special, and we like to highlight the effort our authors put into them as much as possible. One of our newest additions is from the field of molecular biology and genomics, a huge and rapidly advancing research domain.
The work is titled “About the variability, quality and reproducibility of ChIP-seq data”, and it’s open access of course, so everyone and anyone has the opportunity to read it. The new study comes from Hinrich Gronemeyer, a well-respected researcher and Research Director at the Institute of Genetics, Cellular & Molecular Biology (IGBMC) in Strasbourg-Illkirch, and his team.
Continue reading “Next level genomics at ScienceOpen”
Many of you are probably aware of this, but as well as our main aggregation and open peer review functions, we also publish! We’re an Open Access publisher, and we employ innovative methods of peer review to support this.
We just wanted to draw your attention to some great new research that we’ve recently had the pleasure of publishing!
Continue reading “Amazing new research at ScienceOpen!”
Search engines form the core of discovery of research these days. There’s just too much information out there to search journal by journal or on a manual basis.
We highlighted in a previous post the advantages of using ScienceOpen’s dual-layered search and filter functions over others like Google Scholar. Today, we’re happy to announce that we just made it even better!
Say you want to search all of PeerJ’s content. Pop ‘PeerJ’ into the journal search, and it’ll bring up all their content, as it’s all indexed in PubMed. Hey presto, there you have 1530 papers, all with full texts attached. Neat, eh! And that will update as more gets published with PeerJ, so you know what to do.
But that’s a lot of content. What you’ve just discovered is the PeerJ megajournal haystack. Now we want to find the needles.
Continue reading “Pimp my search engine”
Student evaluations of teaching form a core part of our education system. However, there is little evidence to demonstrate that they are effective, or even that they work as they’re supposed to. This is despite such rating systems being used, studied and debated for almost a century.
A new analysis published in ScienceOpen Research offers evidence against the reliability of student evaluations of teaching, particularly as a measure of teaching effectiveness and as a basis for tenure or promotion decisions. In addition, the new study identified a bias against female instructors.
The new study by Anne Boring, Kellie Ottoboni, and Philip Stark (ScienceOpen Board Member) has already been picked up by several major news outlets, including Inside Higher Ed and Pacific Standard. This gives it an Altmetric score of 54 (at the time of writing), the highest for any ScienceOpen Research paper to date!
Continue reading “New ScienceOpen study on the effectiveness of student evaluations of teaching highlights gender bias against female instructors”