Welcome to this Special Issue of EuroScientist on: Research evaluation!
The very existence of scientific career progression hinges on researchers being judged by their peers as eligible for further career advancement and worthy recipients of grant funding.
Yet technology is disrupting what was until now a well-oiled peer-review system.
The upcoming generation of scientists is likely to be evaluated through an evolved version of peer review. New means of measuring their worth offer a much higher level of granularity than the old-fashioned impact-factor approach.
Thus, scientists are entering a period of increased uncertainty over how their abilities are evaluated, until new evaluation methods are validated and widely accepted by the community.
This interim period could bring some opportunities. It opens the door to testing alternative metrics, so-called altmetrics, for example. And it offers opportunities for real-time evaluation of researchers' latest work, thanks to the instant online availability of papers and feedback from the community, reviewers and publishers.
Ironically, what has now become the experimental field of evaluation may have the unforeseen consequence of speeding up the research process itself, because technology helps cut down delays in the evaluation process.
Enjoy this special issue, and don’t forget to provide feedback to us by writing to editor@euroscientist.com or by commenting in the dialogue boxes at the bottom of each article.
Editorial
Evaluation moves in mysterious ways
By Sabine Louët, editor EuroScientist.
Alternatives to peer review
Mentors, mates or metrics: what are the alternatives to peer review?
By Arran Frood, freelance science journalist.
See also our previous coverage on related topics
Navigating the altmetrics maze
By Anthony King, freelance science writer.
Initiatives to validate altmetrics
By Sabine Louët, editor EuroScientist.
New methods of evaluating research
Evaluation: dogma of excellence replaced by scientific diversity
By Francesco Sylos Labini, ROARS.
The academic evaluation conundrum: exclusive interview with Mary Phillips, Academic Analytics
By Sabine Louët, editor EuroScientist.
How well do academics react to being measured?
By Martin Ince, science and education writer.
Collaborative open science speeds up research evaluation
By John Hammersley, writeLaTeX.
Featured image credit: © style-photography.de - Fotolia.com
Do you like what you read in EuroScientist?
Why not help us bring more of these issues to you?
It is simple: all you have to do is follow this link and donate to support us!