[...] Please add here trends not currently captured above, and suggest possible indicators and sources for them.
Only include indicators that are themselves open, so that data can be reused and results can be reproduced.
Bianca Kramer, 30/05/2018 14:47
e.g.: preregistrations (OSF, clinicaltrials.gov, aspredicted.org);
journals accepting preregistered reports; journals practicing open peer review (names published, reports published);
institutions with open science aspects included in T&P policies;
institutions, funders and publishers that are DORA signatories;
publishers/journals that do not advertise impact factor on journal or article webpages;
journals accepting preprints;
preprints published (Crossref, DataCite, both with caveats; a sketch follows below);
journals being transparent about the breakdown of APC costs.
Bianca Kramer, 30/05/2018 16:02
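One of the indicators above, preprints published, can already be measured from open infrastructure alone. A minimal sketch in Python, assuming Crossref's public REST API and its "posted-content" work type (the caveat in the list still applies, since not every preprint server registers with Crossref):

```python
# Illustration only: counting preprints registered with Crossref.
# Assumes the public Crossref REST API and its "posted-content" work
# type; not every preprint server registers with Crossref, hence the
# caveat noted in the list above.
import requests

def crossref_preprint_count(year: int) -> int:
    """Return the number of Crossref 'posted-content' works posted in a given year."""
    resp = requests.get(
        "https://api.crossref.org/types/posted-content/works",
        params={
            "rows": 0,  # only the total count is needed, not the records
            "filter": f"from-posted-date:{year}-01-01,until-posted-date:{year}-12-31",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"]["total-results"]

if __name__ == "__main__":
    print(crossref_preprint_count(2017))
```

DataCite's public REST API could be queried in a similar way to cover repositories that register preprint DOIs there.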
What is absolutely clear here, and needs to be dealt with, is the enormous conflict of interest on display. As has been noted multiple times above, it is bizarre for so many of these trends and criteria to be focused solely on Elsevier-owned services, such as Scopus and Mendeley.
There are numerous reasons why this is strange. Firstly, it comes at a time when many EU member states are in conflict with Elsevier over its business strategies. Secondly, it creates a dependency on these commercial services, as well as an information bias. (1/2)
Jon Tennant, 17/06/2018 12:56
(2/2) Thirdly, Elsevier has a commercial interest in promoting its own services over those of its competitors. As evidenced above, this is a clear conflict of interest and must be treated as such.
If researchers become locked into, or dependent on, this process, then Elsevier has more control over how the research process occurs, how content is produced and distributed, and how those are assessed.
The best approach to resolving this would be to remove Elsevier as sub-contractor and re-form the monitor around an independent expert group that can provide the same service without the inherent COI.
Jon Tennant, 17/06/2018 13:00
I agree with all the comments so far. I therefore won't restate them, other than to say that my responses to previous aspects of the methodological note above provide practical examples of the philosophical and practical issues with using restrictive sources as the basis for monitoring. It is an odd contradiction: monitoring open science using sources that, in part, the open science movement grew up in reaction to.
George Macgregor, 18/06/2018 16:10
I agree with all of the above comments. This is an opportunity to untangle many of the issues existing in the current research ecosystem. Do not pass up this opportunity without careful consideration. This is worth doing well - even if it's difficult.
Ashley Farley, 19/06/2018 05:09
Including for-profit corporations that have a financial incentive to promote their own brands, marginalise open access players and control the narrative around Open Access is counterintuitive and morally bankrupt. Elsevier should be removed as a subcontractor, and all brands Elsevier controls should be removed.
Kade Morton, 24/06/2018 03:20
There is no mention of the citation of articles in Wikipedia or in core research databases. This must be added.
Egon Willighⓐgen, 01/07/2018 11:24
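A minimal sketch of how the Wikipedia part of this could be tracked with open infrastructure, assuming the public MediaWiki search API (counting pages that mention a DOI is only a crude proxy for formal citation):

```python
# Illustration only: a crude proxy for "cited in Wikipedia", assuming
# the public MediaWiki search API. It counts English Wikipedia pages
# whose text mentions a given DOI; a mention is not always a formal
# citation.
import requests

def wikipedia_mentions(doi: str) -> int:
    """Count English Wikipedia pages that mention the given DOI string."""
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "list": "search",
            "srsearch": f'"{doi}"',  # quoted for an exact-phrase match
            "format": "json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["query"]["searchinfo"]["totalhits"]

if __name__ == "__main__":
    # Hypothetical DOI for illustration; substitute a real one.
    print(wikipedia_mentions("10.1234/example.doi"))
```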
Thank you very much for opening this step of your work to our comments. I hope you will continue to follow this track and make your methodology open and pre-registered. Please make sure that you are not creating incentives to use a particular service by monitoring only part of the spectrum (this also means not relying too much on Twitter...), and prefer open source initiatives over closed products.
J. Colomb, @pen, 02/07/2018 13:01
It is obvious that Elsevier has a clear conflict of interest in its role as subcontractor for the OSM, and it should be removed from that position. At a minimum, it should be barred from using its own entities as sources. In the current state, there is no guarantee of objectivity whatsoever.
Etienne Gaudrain, 02/07/2018 13:16
Metrics to measure the uptake of Open Science in industry are missing. They must be added.
Egon Willighⓐgen, 02/07/2018 20:17
Another metric I would like to see added: the percentage of EC grant proposals, both accepted and rejected, published #openaccess in a journal like @RIOJournal.
Egon Willighⓐgen, 03/07/2018 06:42
To echo the comments above: closed databases, and metrics derived from closed databases, are bad for open science. Relying on commercial services that have a COI is also bad for open science.
Justin Salamon, 03/07/2018 17:36
Using closed databases to track open research is rather ironic. More importantly, it is unnecessary and harmful. Please do not use Elsevier as a subcontractor.
Julian Holstein, 03/07/2018 19:30
I absolutely agree with the previous comments. Open databases are key, and subcontracting Elsevier, which has been opposing OA, to monitor OA is counterintuitive to say the least. Thank you for the open process.
Niki Vavatzanidis, 04/07/2018 11:31
Data quality and consistency between data providers must be evaluated. Example work of this kind:
https://osf.io/938wc/
Egon Willighⓐgen, 06/07/2018 10:48
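A minimal sketch of what such an evaluation could look like, assuming the public Crossref and DataCite REST APIs (an illustration only; real work of this kind, like the OSF project linked above, goes much further):

```python
# Illustration only: check which open DOI registry holds a record for a
# given DOI and what title it reports. Assumes the public Crossref
# (api.crossref.org) and DataCite (api.datacite.org) REST APIs. Run over
# a sample of DOIs, missing records and diverging metadata give a first,
# crude measure of cross-provider quality and consistency.
import requests

def compare_doi(doi: str) -> dict:
    """Fetch the same DOI from Crossref and DataCite and report what each returns."""
    report = {}
    cr = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    report["crossref_title"] = cr.json()["message"].get("title") if cr.ok else None
    dc = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
    report["datacite_titles"] = (
        dc.json()["data"]["attributes"].get("titles") if dc.ok else None
    )
    return report
```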
For those who might not be aware, please note that there has been quite a vigorous debate about the role of Elsevier here. All relevant links can be found in the latest post here:
http://fossilsandshit.com/resp... (excuse the URL name...)
Jon Tennant, 06/07/2018 19:10
There is no mention of monitoring equality and diversity to ensure that these ongoing research and impact assessment criteria do not negatively affect any particular communities or demographics.
Jon Tennant, 10/07/2018 13:26
How will these indicators reflect disciplinary differences? For example, most of them concern citations, articles, and data, and are therefore highly geared towards STEM subjects, to the exclusion of the social sciences, humanities, and arts.
Jon Tennant, 10/07/2018 13:30
Evaluating a policy is difficult, and the EU is looking for a single body to manage that. Hence the choices for a more or less general body are either Scopus (European), Clarivate Analytics (American), or Google Scholar (American). Add a touch of lobbying from Elsevier (and probably Springer) and you get the result. But the problem is that the result will be completely biased, proprietary and possibly subject to a lot of manipulation, as mentioned in the other comments. The real solution would be to have a specific, independent EU body for the follow-up that processes all the sources mentioned.
Jean-Pierre Merlet, 13/07/2018 11:46
For the future: please provide an ORCID login; not everyone is on Twitter and Facebook.
Konstantin Stadler, 25/07/2018 12:35
Awareness must be raised in the OS/OI policy and research community of the relevance of gender and the ways OS/OI can mitigate gender inequality and bias in the various aspects of OS/OI. The Genderaction project has studied the links between OS/OI and gender, and a report will soon be publicly available. Key findings have been published in the form of a policy brief:
http://genderaction.eu/wp-cont...
Marina Angelaki, 27/08/2018 13:18
The European Commission and its OSPP Expert Group, along with other stakeholders involved in research assessment (RFOs, RPOs), are encouraged to explore how, and whether, the use of new metrics impacts men and women researchers differently at different career stages and in different disciplines. The choice of metrics during research assessment/evaluation procedures has important implications for researchers' priorities and strategies, their choice of publication venue and the way they present their CVs.
Marina Angelaki, 27/08/2018 13:21
Open Peer Review: as new forms of peer review (like OPR) are being introduced, it is important to examine how different practices impact gender bias and to explore ways of adequately addressing them.
Marina Angelaki, 27/08/2018 13:24
1) preprints, preregistration practices, open peer review, open lab notebooks, protocols and openly shared workflows are completely absent from this framework, but they are the core of Open Science practices
2) you need to include SSH disciplines in this picture
3) scientific blogging is also absent (see e.g. Hypotheses.org)
4) alternative research assessment criteria or best practices, as collected by DORA, should also be included
Elena Giglia, 30/08/2018 13:00
The picture coming out is highly biased towards STEM disciplines and English-language resources, and is limited to journals only.
To include the SSH:
1) Create a new set of indicators specifically referring to monographs, using e.g. DOAB as a source.
2) Include specific SSH-oriented tools like Isidore, which address the multilingualism issue.
3) Track SSH-oriented projects developed by open infrastructures like DARIAH or CLARIN.
4) Include scientific blogs to track the open debate in the SSH, using e.g. Hypotheses.org.
OPERAS, 31/08/2018 09:00
I agree with many of the comments above (from Bianca Kramer and Jon Tennant) as well as others. The main point is: how can the EC/EU create and use an open science monitor based on one source (Elsevier)? Elsevier does NOT represent openness.
Lotta Svantesson, 31/08/2018 14:53
We thank the Commission for the initiative of the Open Science Monitor. This tool must be free of bias and trustworthy. The methodology raises several questions. We detail our remarks and recommendations for future work in a five-page letter. We suggest building the next generation of the Open Science Monitor with a variety of open data sources.
Bernard Larrouturou, Director-General for Research and Innovation and President of the French Open Science Committee.
https://www.dropbox.com/s/pyi1...
Bernard Larrouturou, 31/08/2018 18:20
Institutions having open science in their mission statement
Jeroen Bosman, 31/08/2018 21:58
Institutions having open science in their researcher code of conduct
Jeroen Bosman, 31/08/2018 21:59
Number of journals supporting registered reports document type
Jeroen Bosman, 31/08/2018 22:08
Number of preregistrations
Jeroen Bosman, 31/08/2018 22:37
journals indicating contributor roles
Jeroen Bosman, 31/08/2018 22:39
research funding proposals in which open science is explicitly mentioned
Jeroen Bosman, 31/08/2018 22:40
academic job ads mentioning/asking for open science experience
Jeroen Bosman, 31/08/2018 22:41
jobs for data managers, data consultants and data stewards at universities
Jeroen Bosman, 31/08/2018 22:42
Number of papers published in journals that explicitly state that they select papers on rigour alone, not on novelty
Jeroen Bosman, 31/08/2018 22:44
number of replication studies published
Jeroen Bosman, 31/08/2018 22:44
Number of papers with a plain language abstract
Jeroen Bosman, 31/08/2018 22:47
Number of papers with abstracts in 2 or more world languages
Jeroen Bosman, 31/08/2018 22:47
number of papers stating that only open source software is used
Jeroen Bosman, 31/08/2018 22:48
number of scientific conference posters shared openly, e.g. in Figshare, Zenodo, etc. (see the sketch below)
Jeroen Bosman, 31/08/2018 22:50
Number of conference presentation slides shared openly
Jeroen Bosman, 31/08/2018 22:50
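A rough sketch of how these last two indicators could be approximated today, assuming Zenodo's public search API and its resource-type filter (Figshare exposes a comparable public API):

```python
# Illustration only: approximate counts of openly shared posters and
# slide decks, assuming Zenodo's public search API and its resource-type
# filter; Figshare exposes a comparable public API.
import requests

def zenodo_count(resource_type: str) -> int:
    """Count public Zenodo records of a given resource type, e.g. 'poster'."""
    resp = requests.get(
        "https://zenodo.org/api/records",
        params={"type": resource_type, "size": 1},
        timeout=30,
    )
    resp.raise_for_status()
    total = resp.json()["hits"]["total"]
    # Some API versions report the count as {"value": N} rather than N.
    return total["value"] if isinstance(total, dict) else total

if __name__ == "__main__":
    for rt in ("poster", "presentation"):
        print(rt, zenodo_count(rt))
```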