[...] TP1/TP10 - Top1/Top10 percentile highly cited publications (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Consider SemanticScholar, CiteSeerX and CORE as sources.
Federico Leva, 17/06/2018 17:11
Publisher-controlled sources should be replaced by CrossRef data, which is already collected from publishers. Only open citation data should be considered (see https://i4oc.org/).
Federico Leva, 17/06/2018 17:12
As per some of the comments already, open approaches should be the preferred option here. Open Science Monitor has a unique opportunity to promote a completely different measure of scientific impact. By promoting open metrics the OS Monitor could support the move away from the "traditional" metrics of Scopus and so forth.
George Macgregor, 18/06/2018 15:59
I agree with George - this cannot be stressed enough. Signals are incredibly important in this ecosystem at this time. Sending the wrong signal could be extremely detrimental to the critical work of past and current individuals building open science up for the benefit of research, not profit margins.
Ashley Farley, 19/06/2018 04:40
Fields should not be static in a multidisciplinary research community.
Egon Willighⓐgen, 01/07/2018 11:13
Non-open sources, and especially sources controlled by the OSM, should be removed to prevent conflicts of interest.
Etienne Gaudrain, 02/07/2018 12:28
If anything, I think it should be justified here what high citation counts have to do with open science, which is about moving away from traditional evaluation metrics.
Jon Tennant, 06/07/2018 18:46
Come on! Do you really want to measure open science with rankings and percentiles? Open Science also means alternative evaluation. Journal rankings have hugely damaged science and generated the serials crisis that was at the start of the Open Access movement! Do you really want to measure openness with rankings? Unbelievable. It so happens that your subcontractor is the vendor of a ranking-based tool. There are plenty of EU reports stating that research evaluation needs to take into account and reward open choices by authors. See https://ec.europa.eu/research/...
Elena Giglia, 30/08/2018 12:41
Measuring Open Science, which tends to be an alternative paradigm to the current scholarly communication system, by rankings and percentiles is complete nonsense. Applying old categories and indicators to new ways of publishing might result in a distorted, and useless, picture. Moreover, the journal ranking system contributed to creating the so-called "serials crisis" that was at the origin of the Open Access movement, so it would be ironic to measure Open Access with the same indicator it was born to fight. And, again, books are cut off from citation counts.
OPERAS, 31/08/2018 08:53