Help us improve the indicators!

Last update 09 Oct 2018
64 paragraphs, 291 comments

To add a comment on each indicator or suggest new indicators, just click on the bubble alongside each line.

Open access to publications

Green and gold open access

P - # Scopus publications that enter into the analysis* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Why restrict only to Scopus?
Lluís Revilla - 15 Jun 2018 14:54
It is not relevant to use a paid database to measure open science (or in this case open access articles): 1) Scopus only indexes journals based on registration and includes the ones that fit its criteria; many journals from the Global South (like Indonesia, where all journals are OA) are not included. 2) The Scopus database contains mostly English-based journals. Again, very few journals from the Global South are listed. 3) Scopus does not index repositories. 1/2
Dⓐsapta Erwin Irawan - 17 Jun 2018 16:00
The scientific community can always go to Google Scholar, BASE, SHARE (and now Dimensions) to get articles that are not listed in the Scopus database. Promoting the usage of a paid database in this project can lead to more damage to the publishing system in non-US/EU countries, because they tend to follow the policy of western countries. 2/2
Dⓐsapta Erwin Irawan - 17 Jun 2018 16:04
I would agree with Dasapta's comments here. By focusing on a single indexing & abstracting tool, the Open Science Monitor is immediately excluding a large corpus of relevant data and content. No indexing & abstracting service indexes all known literature, so a multiple-system approach is a necessity.
George Macgregor - 18 Jun 2018 15:23
Such an approach may require the inclusion of DS Dimensions, 1findr (both of which are open) and WoS. I appreciate that Google Scholar and MS Academic might be difficult to gather meaningful data from - but worth investigating? Whatever the final approach, something more holistic in scope would benefit the OS Monitor, researchers and other users.
George Macgregor - 18 Jun 2018 15:50
I agree with the other comments that paid/subscription databases should not be used in this analysis (or be the sole source). This is a massive opportunity to espouse the very open science practices that funders and institutions are aiming to encourage or mandate researchers to adopt.
Ashley Farley - 19 Jun 2018 00:34
The source databases should be (1) open, i.e. not behind a paywall, and with a clear, accessible methodology; (2) independent, i.e. not driven by a group that would benefit from it, but done in the public's interest, which probably means maintained by a consortium including universities; (3) multiple, as cross validation is key in capturing reliable trends. As such, Scopus has to be excluded.
Etienne Gaudrain - 02 Jul 2018 12:09
As an open science advocate, I will not have an Elsevier credential; I just erased my Mendeley account for that reason. So if it is in Scopus, I would tend to think it is not open science. BTW: a login with ORCID instead of Twitter would have been nice...
J. Colomb, @pen - 02 Jul 2018 12:19
To echo the comments above: closed databases are bad for open science. Please do not rely on closed databases (e.g. Scopus). Thanks.
Justin Salamon - 03 Jul 2018 17:31
Using a paid database seems problematic for promoting open communication within the scientific community... The other comments have provided reasons for the above claim.
Qingyang Xi - 03 Jul 2018 18:24
I do not feel that making "# of Scopus publications" the basis for assessing the corpus of scholarly publications will result in a fair assessment of the % of Open Access. Scopus requires that a journal publish for at least 3 years and then go through a rigorous process where one of the criteria is whether inclusion of a journal will increase or reduce the reputation of the Scopus product. Very many open access journals supported by university libraries are not indexed in Scopus. Many open access publications are new or experimental and would never be indexed in the Scopus database.
Stephanie Dawson - 06 Jul 2018 18:19
In addition to all of the above, it should be questioned why Web of Science is not being used for this process, especially now that it provides OA status after integrating with Unpaywall. WoS was also used for the first version of the monitor, so it is unclear why this switch was made, instead of using both services.
Jon Tennant - 06 Jul 2018 18:38
I agree that the choice of Elsevier as a subcontractor is very worrying, since their entire business model is either non-OpenSource, or very-expensive-OpenSource, both of which are counterposed to the openness of science, and draw important resources away from good research. It would be more credible to use a more independent subcontractor, which then can choose a more balanced set of resources.
Bernhard Englitz - 18 Jul 2018 13:40
Get rid of Elsevier! Their data are fake: - "it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment" (San Francisco Declaration) - 'Predatory' open access: a longitudinal study of article volumes and market characteristics, Cenyu Shen and Bo-Christer Björk, BMC Medicine 2015, 13:230, DOI: 10.1186/s12916-015-0469-2 - https://openarchiv.hypotheses.... - https://openarchiv.hypotheses....
odilehennaut - 18 Jul 2018 16:00
Great initiative of the EU to establish an open science monitor. However, in its current form there are some troublesome limitations: 1) # Scopus publications: This point is not clear, as other sources are also given in brackets. I sincerely hope that the EU is not considering only Scopus-listed publications for the monitor, as several open science journals are not listed in Scopus. For example: "Journal of Open Source Software" and "Journal of Open Research Software".
Konstantin Stadler - 25 Jul 2018 11:43
2) Transparency and Open Access are among the most important benefits of Open Science. In contrast, the Scopus database can only be accessed with Elsevier credentials, making it impossible for citizen scientists without institutional access to check and validate any results from the monitor. In the light of current events, this might also apply to German and Swedish scientists in general!
Konstantin Stadler - 25 Jul 2018 11:47
Being a paid database and using non-inclusive criteria are not the only concerns here. Using Scopus as the primary basis for assessment will lead to disciplinary inequalities as well. The low coverage of Arts and Humanities journals in Scopus has been pointed out several times in previous studies. For instance, Mongeon and Paul-Hus (2016) (DOI: 10.1007/s11192-015-1765-5) conclude that the Scopus coverage of Arts and Humanities journals (compared to Ulrich's extensive periodical directory) is less than 20%.
Erzsébet Tóth-Czifra - 25 Jul 2018 16:21
Also, in Arts and Humanities disciplines, article citations are much less dominant indicators of excellence than in STEM. Therefore the inclusion of DOAB in the sources would contribute to a somewhat more balanced picture.
Erzsébet Tóth-Czifra - 25 Jul 2018 16:21
Assessing life sciences with Scopus and WoS would be acceptable. But social, educational, law, computational and economic sciences are poorly indexed by these databases. Indeed, history, educational and social sciences produce mainly book chapters, books and reviews, document types poorly indexed by Scopus & WoS. Computational sciences produce mainly conference proceedings, which are also poorly indexed by those databases. Disciplines relying heavily on local context, such as history or human geography, are not published in English international journals, and are not indexed by Scopus & WoS. 1/2
Sylvie Vullioud - 28 Jul 2018 15:45
2/2 Therefore, sources for OA monitoring should be selected by domain. One size fits all is scientifically irrelevant.
Sylvie Vullioud - 28 Jul 2018 15:46
Using the proprietary and for profit Scopus to monitor open science is like asking a fox to guard the henhouse in order to check the freedom of the chickens.
M.Chiara Pievatolo - 22 Aug 2018 12:53
Scopus is not the right source: a) to be included you need to have at least 3 issues, so for yearly journals that means 3 years; b) monographs are never included in Scopus, and the coverage of Social Sciences and Humanities is not so wide. Using just Scopus means cutting off half of the scientific world. At least DOAB (Directory of Open Access Books) needs to be used, alongside at least DOAJ (Directory of Open Access Journals); c) Scopus is a proprietary database, so it can't be the only source. Moreover, it's owned by Elsevier, which has been lobbying for years against Open Access.
Elena Giglia - 30 Aug 2018 12:22
Unacceptable restriction to Scopus. At the very least it should be complemented by Clarivate Analytics data.
Marc VANHOLSBEECK - 30 Aug 2018 16:55
If this indicator is meant to measure all the "publications" in any discipline and language and not only "journal articles", the use of Scopus is highly misleading. Monographs are not included in Scopus, nor are the vast majority of journals published in the Social Sciences and Humanities. Scopus is English-language oriented, thus excluding publications in other languages. The use of Scopus cuts off half of the disciplines in the ERA. Moreover, Scopus is a proprietary database, so it should never be used as the only source.
OPERAS - 31 Aug 2018 08:49
If we want Open Access results then we must ensure that the infrastructure underpinning them is built on the same principles. Placing the monitoring of Open Access output exclusively in the hands of a closed, for-profit service that ignores large swathes of academic output (e.g. non-English, AHSS fields) not only undercuts the monitor's results; it will damage the credibility of the EU open policy goals themselves.
Andrea Hacker - 31 Aug 2018 10:29
We all are aware of the importance (and power) of Scopus and WoS, but when will the EU open itself really to alternative indicators?
Delfim Leão - 31 Aug 2018 10:32
It is stated in the draft methodological note that "the study covers all research disciplines, and aims to identify the differences in open science adoption and dynamics between diverse disciplines" (p. 6). I think it is hard to find someone today arguing that one data source can accomplish this; a variety of sources is needed (and preferably open, as stated in comments above). To pick one will take us back to the problem of searching for the key where the light is, and this is not in tune with recent developments in the scientometric indicators field such as the Leiden Manifesto (principle 6).
Camilla Lindelöw - 31 Aug 2018 11:02
Why are all the sources listed for indicator P when, as I understand it, it will be based on Scopus solely? The same goes for the indicators TCS, FWCI, TP1/TP10.
Camilla Lindelöw - 31 Aug 2018 11:08
I agree with many of these comments above (and I do not appreciate having to log in to make a comment, on top of that either with Facebook or Twitter). It is not good to base this only on Scopus (Elsevier).
Lotta Svantesson - 31 Aug 2018 12:10
Two main problems with Scopus as the baseline for total publications: it is not transparent, and its coverage is not sufficient in terms of languages, document types and fields.
Jeroen Bosman - 31 Aug 2018 21:23
P(oa) - # Scopus publications that are Open Access (CWTS method for OA identification)* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Why only Open Access in Scopus and not all articles that are open access?
Lluís Revilla - 15 Jun 2018 14:54
Because Elsevier are the sub-contractor for this.
Jon Tennant - 17 Jun 2018 12:49
It is good to observe important open science tools, such as DOAJ, ROAD, OpenAire, etc., being proposed here; but I worry that there is insufficient emphasis on Green OA. For example, only OpenAIRE will surface data about Green OA, yet OpenAIRE's coverage may be limited. It is probably necessary to seek inclusion of IRs or aggregations outside the OpenAIRE infrastructure.
George Macgregor - 18 Jun 2018 15:27
Publication sources cannot include Scopus. If Elsevier, subcontracted in this monitoring action, had its own open science agenda, it could alter its own criteria for inclusion of Open Access publications in Scopus and steer conclusions about OA. There is a clear, immediate conflict of interest.
Etienne Gaudrain - 02 Jul 2018 12:15
What is the CWTS method for Open Access identification? According to the note, this is a combination of 5 different sources (DOAJ, PMC, ROAD, CrossRef, and OpenAIRE). As mentioned above, these data can be obtained from a single service, such as Web of Science with Unpaywall. Again, it is unclear why Clarivate's service has been replaced with the more complex Elsevier one. Will these data, matching algorithms, and other methods be made available?
Jon Tennant - 06 Jul 2018 18:42
I don't know what the CWTS method is, but I agree with the comment from Jon Tennant above (https://www.makingspeechestalk... ): it's troublesome that tax money is used to reinvent the wheel when the data is readily available from other services.
Konstantin Stadler - 25 Jul 2018 12:02
Also, I hope that the CWTS method is better than the one currently applied by Elsevier/Scopus, as many OA articles are not reported as OA in Scopus or not listed at all (examples can be provided). There is a great risk of greatly underestimating the role of OA with the currently proposed methodology.
Konstantin Stadler - 25 Jul 2018 12:08
Further to my comment above, I want to draw attention to a recent publication in which two of the authors are from the CWTS: https://osf.io/preprints/socar... This study uses the same sources, with the exception that it uses Google Scholar (free) instead of Scopus (paid). Why is this not considered here, when members of the same research group are using the method, and clearly to the same effect? They are able to adequately and accurately assess publisher-based OA proportions for the different 'types', and delineate the data also based on discipline and country.
Jon Tennant - 26 Jul 2018 12:06
Using this indicator will result in a very strong STEM-journal-English speaking bias. At least DOAB (Directory of Open Access Books) should be used in order to include Open Access books.
OPERAS - 31 Aug 2018 08:49
Monographs are absent or at least poorly represented, thus leaving SSH production clearly unrepresented. I strongly advise using at least DOAB references.
Delfim Leão - 31 Aug 2018 10:37
P(green oa) - # Scopus publications that are Green OA* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Why restrict only to Scopus?
Lluís Revilla - 15 Jun 2018 14:55
I would actually refer anyone reading this to my previous comment, because it addresses this: "It is good to observe important open science tools, such as DOAJ, ROAD, OpenAire, etc., being proposed here; but I worry that there is insufficient emphasis on Green OA. For example, only OpenAIRE will surface data about Green OA, yet OpenAIRE's coverage may be limited. It is probably necessary to seek inclusion of IRs or aggregations outside the OpenAIRE infrastructure."
George Macgregor - 18 Jun 2018 15:30
But, in addition, by using only Scopus, a restrictive - and un-open-science - publication ethos will be imposed on the Open Science Monitor. After all, Scopus is only a subset of the published literature. An important ethos of open science is promoting the publication of research that might sit outside the scope of Scopus or WoS. The drive to publish in journals indexed by these sorts of services distorts healthy scholarly communication and is precisely why open science principles have been adopted by researchers. Therefore....
George Macgregor - 18 Jun 2018 15:35
....Wouldn't Open Science Monitor simply be reinforcing the bias of Scopus within its open science monitoring system?
George Macgregor - 18 Jun 2018 15:35
Publication sources cannot include Scopus. If Elsevier, subcontracted in this monitoring action, had its own open science agenda, it could alter its own criteria for inclusion of Open Access publications in Scopus and steer conclusions about OA. There is a clear, immediate conflict of interest.
Etienne Gaudrain - 02 Jul 2018 12:16
Get rid of Elsevier! Their data are fake: - "it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment" (San Francisco Declaration) - 'Predatory' open access: a longitudinal study of article volumes and market characteristics, Cenyu Shen and Bo-Christer Björk, BMC Medicine 2015, 13:230, DOI: 10.1186/s12916-015-0469-2 - https://openarchiv.hypotheses.... - https://openarchiv.hypotheses....
odilehennaut - 18 Jul 2018 16:02
As said, using Scopus is not fair, as it's proprietary and immediately opens a conflict of interest; moreover, counting the number of green publications included in Scopus is absolutely limiting. You will exclude all the monographs, the book chapters, the SSH journals not included in Scopus, and all the emerging communication tools such as preprints. "Green" open access is a wider concept; measuring it against Scopus is methodologically flawed. What about all the preprints included in arXiv and not yet published in a journal? They are Green OA and won't be counted, thus also excluding physics. Absurd.
Elena Giglia - 30 Aug 2018 12:30
Unacceptable restriction to Scopus (for all the reasons mentioned in the previous comment). At the very least it should be complemented by Clarivate Analytics data.
Marc VANHOLSBEECK - 30 Aug 2018 16:57
P(gold oa) - # Scopus publications that are Gold OA* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Why restrict only to Scopus?
Lluís Revilla - 15 Jun 2018 14:55
Publication sources cannot include Scopus. If Elsevier, subcontracted in this monitoring action, had its own open science agenda, it could alter its own criteria for inclusion of Open Access publications in Scopus and steer conclusions about OA. There is a clear, immediate conflict of interest.
Etienne Gaudrain - 02 Jul 2018 12:16
If Scopus uses DOAJ and ROAD to assess the OA status of journals, should not these primary data sources be used instead? It is unclear what additional benefit using Scopus has. https://blog.scopus.com/posts/...
Jon Tennant - 25 Jul 2018 10:53
Again, this indicator does not take into account all the monographs indexed in the DOAB-Directory of Open Access Books, most of which are Gold OA. It also excludes all the university-based Open Access journals which are Gold even though APC-free. It scarcely represents SSH disciplines as their journals are not indexed in Scopus.
OPERAS - 31 Aug 2018 08:51
PP(oa) - Percentage OA publications of total publications* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Replace all sources with Unpaywall/ImpactStory, which already combines all the necessary/good ones.
Federico Leva - 17 Jun 2018 17:13
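Editor's note: to make the Unpaywall suggestion above concrete, here is a minimal sketch of how PP(oa) could be computed from the free Unpaywall REST API alone. The contact address and the input DOI list are placeholders; Unpaywall requires a real e-mail in each request, and "oa_status" is the field it returns for gold/green/hybrid/bronze/closed classification.

    # Sketch: classify DOIs by OA status via the Unpaywall API (no Scopus needed).
    import requests

    EMAIL = "you@example.org"       # placeholder; Unpaywall requires a contact address
    dois = ["10.1038/nature12373"]  # placeholder input list of DOIs to classify

    counts = {}
    for doi in dois:
        r = requests.get(f"https://api.unpaywall.org/v2/{doi}", params={"email": EMAIL})
        r.raise_for_status()
        status = r.json().get("oa_status", "unknown")  # gold/green/hybrid/bronze/closed
        counts[status] = counts.get(status, 0) + 1

    total = sum(counts.values())
    for status, n in sorted(counts.items()):
        print(f"{status}: {n}/{total} ({100 * n / total:.1f}%)")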
Remove sources that may suffer from conflict of interest: Scopus.
Etienne Gaudrain - 02 Jul 2018 12:17
I think BASE is already getting good data about this question, although it may all come from CrossRef (?). It may be worth having a look at ScienceOpen data, too.
J. Colomb, @pen - 02 Jul 2018 12:41
Elsevier products (Scopus) have pretty obvious conflicts of interest?
Qingyang Xi - 03 Jul 2018 18:27
Again, if you consider only Scopus - with all the proprietary/conflict of interest limitations already noticed - all these indicators should be named not "publications" but "journal articles" percentages.
Elena Giglia - 30 Aug 2018 12:32
PP(green oa) - Percentage green OA publications of total publications* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Replace all sources with Unpaywall/ImpactStory, which already combines all the necessary/good ones.
Federico Leva - 17 Jun 2018 17:13
PP(gold oa) - Percentage gold OA publications of total publications* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Replace all sources with Unpaywall/ImpactStory, which already combines all the necessary/good ones.
Federico Leva - 17 Jun 2018 17:13
Remove sources that may suffer from conflict of interest: Scopus.
Etienne Gaudrain - 02 Jul 2018 12:17
What would be in the spirit of Open Science monitoring would be to take into account the diversity of research outputs that are accessible via repositories. Using Scopus for Green OA monitoring is IMHO almost an oxymoron.
Marc VANHOLSBEECK - 30 Aug 2018 16:59
TCS - Total Citation Score. Sum of all citations received by P in Scopus. (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
What about other sources like Google Scholar?
Lluís Revilla - 15 Jun 2018 14:55
Also Dimensions or Web of Science. Constraining this to one source, that just happens to be owned by the sub-contractor, is an exceptionally bad practice.
Jon Tennant - 17 Jun 2018 12:50
Agreeing with the comments: WoS, Dimensions, and Google Scholar must be added.
Egon Willighⓐgen - 01 Jul 2018 11:10
Other than that, citation counts have nothing to do with the form of Open Access.
Egon Willighⓐgen - 01 Jul 2018 11:12
And OpenCitations is missing and must be added.
Egon Willighⓐgen - 01 Jul 2018 11:12
The number of citations is a very tricky measure because it is associated with a time window: it takes some time for papers to be cited, which is largely dependent on the field of research and on the type of findings. For instance, some truly groundbreaking work may take years to be understood by the community and only becomes cited after a long period of time, while some more minor incremental work may get highly cited if it is published right when a topic is most popular. In other words, the number of citations in a short period of time (e.g. 2 years) is a poor metric of how useful a publication is.
Etienne Gaudrain - 02 Jul 2018 12:24
Dealing with citations, you can't stick only to Scopus; Web of Science and other databases show different numbers. And, again, you are referring only to journal articles and not to "publications" in general, and only to STEM disciplines.
Elena Giglia - 30 Aug 2018 12:35
Using a single, proprietary database to measure citations is a wrong method. All other available databases or tools should be added. It has to be underlined that again monographs and most of the SSH journals are excluded from any citation database.
OPERAS - 31 Aug 2018 08:52
The question is what kind of statements you intend to base on this variable. To what extent is this valuable for assessing the adoption, penetration or effect of open science?
Jeroen Bosman - 31 Aug 2018 21:31
FWCI – Field Weighted Citation Score. (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
FWCI solves some of the issues of TCS, but inherent biases make it very hard to interpret those numbers.
Etienne Gaudrain - 02 Jul 2018 12:25
A common issue with FWCI is that the definition of fields is generally perfectly arbitrary, and tends to be based on historical separations.
Etienne Gaudrain - 02 Jul 2018 12:26
It could be "improved" via trying to tag articles via machine learning...
Lluís Revilla - 26 Jul 2018 12:55
TP1/TP10 - Top1/Top10 percentile highly cited publications (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Consider SemanticScholar, CiteSeerX and CORE as sources.
Federico Leva - 17 Jun 2018 17:11
Publisher-controlled sources should be replaced by CrossRef data, which is already collected from publishers. Only open citation data should be considered (see https://i4oc.org/ ).
Federico Leva - 17 Jun 2018 17:12
As per some of the comments already, open approaches should be the preferred option here. Open Science Monitor has a unique opportunity to promote a completely different measure of scientific impact. By promoting open metrics the OS Monitor could support the move away from the "traditional" metrics of Scopus and so forth.
George Macgregor - 18 Jun 2018 15:59
I agree with George - this cannot be stressed enough. Signals are incredibly important in this ecosystem at this time. Sending the wrong signal could be extremely detrimental to the critical work of past and current individuals building open science up for the benefit of research not profit margins.
Ashley Farley - 19 Jun 2018 04:40
Fields should not be static in a multidisciplinary research community.
Egon Willighⓐgen - 01 Jul 2018 11:13
Non-open sources, and especially sources that are controlled by the OSM, should be removed to prevent conflicts of interest.
Etienne Gaudrain - 02 Jul 2018 12:28
I think that what high citation counts have to do with open science - which is about moving away from traditional evaluation metrics - should, if anything, be justified here.
Jon Tennant - 06 Jul 2018 18:46
Come on! Do you really want to measure Open Science with rankings and percentiles? Open Science also means alternative evaluation. Journal rankings have hugely damaged science and generated the serials crisis which was at the start of the Open Access movement! Do you really want to measure openness with rankings? Unbelievable. It happens because your subcontractor is the vendor of a ranking-based tool. There are plenty of EU reports stating that research evaluation needs to take into account and reward open choices by authors. See https://ec.europa.eu/research/...
Elena Giglia - 30 Aug 2018 12:41
Measuring Open Science, which tends to be an alternative paradigm to the current scholarly communication system, by rankings and percentiles is complete nonsense. Applying old categories and indicators to new ways of publishing might result in a distorted - and useless - picture. Moreover, the journal ranking system contributed to creating the so-called "serials crisis" which was at the beginning of the Open Access movement, so it would be ironic to measure Open Access with the same indicator it was born to fight. And, again, books are cut off from citation counts.
OPERAS - 31 Aug 2018 08:53
Any additional indicator (and source)?
Number of open access works whose references are also all open access (idea credit: John Dove). Source: iterate over open citation data at CrossRef and merge with Unpaywall data.
Federico Leva - 17 Jun 2018 17:37
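Editor's note: a minimal sketch of the indicator proposed above, assuming the public Crossref and Unpaywall APIs; the example DOI and e-mail are placeholders. Only references that publishers have deposited openly at Crossref (see i4oc.org) are visible, so this necessarily undercounts.

    # Sketch: is every Crossref-deposited reference of a work itself OA (per Unpaywall)?
    import requests

    EMAIL = "you@example.org"             # placeholder contact address for Unpaywall
    doi = "10.1371/journal.pone.0115253"  # placeholder example DOI

    work = requests.get(f"https://api.crossref.org/works/{doi}").json()["message"]
    ref_dois = [ref["DOI"] for ref in work.get("reference", []) if "DOI" in ref]

    def is_oa(d):
        r = requests.get(f"https://api.unpaywall.org/v2/{d}", params={"email": EMAIL})
        return r.ok and r.json().get("is_oa", False)

    oa_flags = [is_oa(d) for d in ref_dois]
    print(f"{sum(oa_flags)}/{len(oa_flags)} referenced DOIs are OA:",
          "all-OA work" if oa_flags and all(oa_flags) else "not all-OA")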
Number of academic entities publishing their subscription spending. Possible source: fund OpenAIRE to do it. Current sources: see e.g. http://stuartlawson.org/2016/0...
Federico Leva - 17 Jun 2018 18:07
The greatest omission here is that it doesn't look into basic Open Science rights: I want to see metrics on whether works allow reuse and redistribution. Metrics should be added to distinguish between "allows reuse, modification, and redistribution" and "does not allow reuse, modification, and redistribution".
Egon Willighⓐgen - 01 Jul 2018 11:15
Access from developing countries to OA publications vs. non-OA publications. There is a chance this can only be evaluated through contact with the individual publishers, or through online surveys.
Etienne Gaudrain - 02 Jul 2018 12:31
Maybe worth contacting universities that need to produce this data too (most of them have contracts to be above 60% OA in 2020 and are finding ways to monitor that).
J. Colomb, @pen - 02 Jul 2018 12:45
Survey addressed to scientists on reception of OA across scientific fields: Do you trust OA more or less than non-OA? Do you benefit from accessing OA publications? Do you benefit from publishing OA?
Etienne Gaudrain - 02 Jul 2018 12:52
Survey addressed to scientific journalists, patient organisations, or in general citizens concerned with scientific topics, on the role that OA plays for them in order to evaluate whether OA contributes to making scientific findings available to the society at large.
Etienne Gaudrain - 02 Jul 2018 12:54
I'm not quite sure how "open access" is being defined here, but is arXiv.org taken into account? In my fields of research this is the main source of information nowadays (not the publications in journals).
Rodrigo Benenson - 05 Jul 2018 13:28
How about a breakdown of the cost associated with meeting OA in all of the above indicators? This could include by country, by publisher, by journal, etc.
Jon Tennant - 06 Jul 2018 18:44
Institutional OA repositories are very useful to monitor Gold, Green, Platinum and hybrid OA. Today, many more researchers have an ORCID, and some institutions also have ORCID. What about pushing for an ORCID and Unpaywall combination for OA monitoring? They are both non-profit organizations.
Sylvie Vullioud - 28 Jul 2018 15:51
2/3 In fact, non-profit tools should be used as the basis of OA monitoring calculations: ORCID, Unpaywall, Zenodo (a non-profit, EU-supported OA and ORD database), institutional databases, and the huge thematic medical PMC and EuropePMC databases. The commercial Scopus and WoS databases could then be used to "check" the results at the end.
Sylvie Vullioud - 28 Jul 2018 15:59
3/3 By doing so: - OA monitoring would *motivate* researchers to create an ORCID and fill in their institutional/thematic databases. - It would weaken the IF and h-index for career assessment and be compatible with DORA signatories. - It would allow the rise of OA without mandatory national or university laws. - The calculation may not be so accurate at the beginning, but with *time* it will improve, without any *dependence* on the commercial Scopus and WoS. They would be only tools among others in a fair competitive market.
Sylvie Vullioud - 28 Jul 2018 16:02
1) ORCID and Unpaywall association is a good suggestion 2) DOAB for monographs 3) DOAJ to include also SSH journals 4) preprints servers 5) tracking the costs, e.g. via OpenAPC https://treemaps.intact-projec...
Elena Giglia - 30 Aug 2018 12:46
I agree with the comments above regarding preprints. Consideration of preprints is important and it is not clear how they are included. In addition, only 4 sources of preprints are used, which is quite limiting. OSF preprints and Preprints.org (disclosure - I am director of the latter) are not included.
Martyn Rittman - 30 Aug 2018 15:51
It's disappointing to see that the only outlets mentioned, at least as far as I could discern, are journals. In many disciplines the main (or a main) form of publication is the book - whether single-authored, a collection of related papers on a given theme, or a specialist catalogue or textual edition and commentary. It would be good to see this form of publication included here. I speak both in a personal capacity and as a member of the UUK OA monograph working group.
Nigel Vincent - 31 Aug 2018 15:48
I'd echo Nigel's comment (also as a member of the UUK OA monographs working group), particularly given that significant funders are signalling intent to develop OA policy in the direction of long form publications.
Chris Banks - 31 Aug 2018 17:03
Publications with open licenses; publications for which a preprint is available; publications with open peer review reports; publications with open peer review identities; publications with open peer review editor/reviewer/author communication
Jeroen Bosman - 31 Aug 2018 21:36
Publications of which the citation data are fully open and complete (Crossref)
Jeroen Bosman - 31 Aug 2018 21:38
OA availability of sources cited in the papers
Jeroen Bosman - 31 Aug 2018 21:44
researchers' attitudes towards open access
Jeroen Bosman - 31 Aug 2018 21:57
researchers' attitudes towards types of open licenses
Jeroen Bosman - 31 Aug 2018 21:58
share of institutions having a repository
Jeroen Bosman - 31 Aug 2018 22:00
countries having scholarly open access rights in their copyright law
Jeroen Bosman - 31 Aug 2018 22:02
Number of fully open access and openly licensed scholarly books
Jeroen Bosman - 31 Aug 2018 22:51
Number of registered and open peer reviews (publons)
Jeroen Bosman - 31 Aug 2018 22:52
Number of institutions with an established open science community
Jeroen Bosman - 31 Aug 2018 22:58
Number of institutions having signed DORA
Jeroen Bosman - 31 Aug 2018 23:01
Number of publishers having signed DORA
Jeroen Bosman - 31 Aug 2018 23:01
number of funders having signed DORA
Jeroen Bosman - 31 Aug 2018 23:02
share of institutions having a strong repository mandate (roarmap)
Jeroen Bosman - 31 Aug 2018 23:02
Number of journals flipped to full OA
Jeroen Bosman - 31 Aug 2018 23:04

Funders' policies

Number of Funders with open access policies * (Source: Sherpa Juliet)
A number is a very coarse proxy here. It should include additional aspects such as APC caps, preference for green or gold, infrastructure support, licensing concerns, embargo concerns, and more.
Jon Tennant - 06 Jul 2018 18:46
This data should always be used with the caveat that it is likely skewed towards UK/European/Western organisations based on its (limited) funding from Jisc - for example, Japan is listed as having no policies although at least two major funders have such a policy: http://openscience.jp/oa/oa-po...
Tony Ross-Hellauer - 15 Jul 2018 17:02
Any additional indicator (and source)?
Number of funder policies available and linked in Wikidata?
Federico Leva - 17 Jun 2018 17:14
Add open data policies; add reagent policies (should newly created material be shared with the community?). Also record what implementation procedures are in place (data management plan required, exceptions allowed, own repository created, what happens if the policy is not followed...) and give more details (is green OA enough or is gold OA necessary, ...).
J. Colomb, @pen - 02 Jul 2018 12:49
Seeing as this is an open science monitor, why not include the different aspects of open science too? These could be those such as open data, or more inclusive open science ones too.
Jon Tennant - 06 Jul 2018 18:45
number of openly available proposals and decision letters
Jeroen Bosman - 31 Aug 2018 21:40
Funders with data and code sharing requirements
Jeroen Bosman - 31 Aug 2018 21:41
Funders with diversity statements/statistics regarding their review panels
Jeroen Bosman - 31 Aug 2018 21:42

Journals' policies

Number of Journals with open access policies * (Source: Sherpa Romeo)
Add: number of publishers which comply with the Creative Commons guidelines for marking Creative Commons works (a rel="license" links etc.). Possible source: https://docs.google.com/spread... or search.creativecommons.org.
Federico Leva - 17 Jun 2018 17:41
Any additional indicator (and source)?
Number or percentage of journals with a BOAI-compatible, free and open license (CC-BY or CC-BY-SA). Source: DOAJ. https://doaj.org/search?source...
Federico Leva - 17 Jun 2018 17:04
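Editor's note: a sketch of how this percentage could be pulled from DOAJ's public search API. The v2 path and the fielded query syntax (bibjson.license.type) are assumptions based on DOAJ's published API documentation; verify against doaj.org/api before relying on the numbers.

    # Sketch: share of DOAJ journals licensed CC BY or CC BY-SA (BOAI-compatible).
    import requests

    BASE_URL = "https://doaj.org/api/v2/search/journals/"

    def total_for(query):
        r = requests.get(BASE_URL + requests.utils.quote(query, safe=""),
                         params={"pageSize": 1})
        r.raise_for_status()
        return r.json()["total"]

    all_journals = total_for("*")
    open_license = sum(total_for(f'bibjson.license.type:"{lic}"')
                       for lic in ("CC BY", "CC BY-SA"))
    print(f"{open_license}/{all_journals} "
          f"({100 * open_license / all_journals:.1f}%) journals are CC BY / CC BY-SA")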
Number of journals with open peer review (possibly filtered by DOAJ seal to ensure data quality). Source: DOAJ. https://doaj.org/search?source...
Federico Leva - 17 Jun 2018 17:05
Number of open access journals hosted and fully owned by universities or university consortia or other academic-controlled non-profit entities. Source: DOAJ with manual filtering, or a new tag.
Federico Leva - 17 Jun 2018 17:09
Percentage of open citations (deposited by publishers on CrossRef). Source: I4OC. https://i4oc.org/#progress
Federico Leva - 17 Jun 2018 17:35
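Editor's note: I4OC publishes the authoritative figures, but as an illustration the same percentage can be approximated directly against the Crossref REST API, whose "sample" and "has-references" parameters are documented; a visible "reference" array marks openly deposited citations. A 100-work sample is illustrative only.

    # Sketch: rough share of sampled Crossref works whose reference lists are open.
    import requests

    r = requests.get("https://api.crossref.org/works",
                     params={"sample": 100, "filter": "has-references:true"})
    r.raise_for_status()
    items = r.json()["message"]["items"]
    open_refs = sum(1 for w in items if "reference" in w)  # visible only if open
    print(f"{open_refs}/{len(items)} sampled works expose their reference lists")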
I think that citations to open access research within patents would be interesting to capture. Could use lens.org.
Ashley Farley - 19 Jun 2018 04:41
I want a journal metric that shows they take DORA seriously and do not actually use the flawed JIF metric.
Egon Willighⓐgen - 01 Jul 2018 11:16
Add open data policies. Add policies about materials and methods sections.
J. Colomb, @pen - 02 Jul 2018 12:51
Cost of OA publishing per journal/publisher.
Etienne Gaudrain - 02 Jul 2018 12:57
I think APCs should be included and monitored. At the same time, also pre-print and post-print policies need to be considered. A quantitative indicator to be included can be the embargo period.
Alessandro Sarretta - 04 Jul 2018 01:34
Number of journals that allow preprints and postprints, and what their embargo periods are. Suggest also looking at the OAS evaluation tool: https://www.tandfonline.com/do... The fact that the Avoin Tiede report for measuring the openness of journals and publishers has not been included here, nor any of the evaluation criteria within it, is deeply concerning. https://www.tandfonline.com/do...
Jon Tennant - 06 Jul 2018 18:57
Other things: Journals that accept articles based on soundness only; journals that use registered reports; journals that encourage reproducibility studies; journals that encourage publication of 'negative results'. These are all key aspects of open science that are missing.
Jon Tennant - 06 Jul 2018 18:58
Furthermore, journals that still advertise impact factors in some form should be included in this.
Jon Tennant - 06 Jul 2018 19:03
Publishers' policies: you might also include book publishers here.
Elena Giglia - 30 Aug 2018 13:36
You might consider also adding publishers' policies, in order to also include book publishers.
OPERAS - 31 Aug 2018 08:54
openly licensed open access to back volumes, back to vol. 1 issue 1
Jeroen Bosman - 31 Aug 2018 21:46
percentage of papers in the full publication history of the journal that are OA
Jeroen Bosman - 31 Aug 2018 21:47
journals with policies preventing re-enclosure after selling the title to another publisher
Jeroen Bosman - 31 Aug 2018 21:48
Number of diamond/platinum journals and papers therein
Jeroen Bosman - 31 Aug 2018 22:04

Open research data

Number of repositories

Number of open data repositories* (Source: Re3data)
Is more better?
Jeroen Bosman - 31 Aug 2018 21:52
Any additional indicator (and source)?
Number and/or total size of CC-0 datasets. Source: BASE. https://www.base-search.net/Se...
Federico Leva - 17 Jun 2018 17:16
Number of OAI-compliant repositories. Source: BASE content sources. https://www.base-search.net/ab...
Federico Leva - 17 Jun 2018 17:59
Number of repositories with an open data (https://opendefinition.org/ ) policy for metadata. Source: OpenDOAR, "commercial" in metadata reuse policy.
Federico Leva - 17 Jun 2018 18:01
Number of datasets issued with a DOI, and where they are cited. I think trying to capture how/where data is reused is more impactful than just monitoring the number of repositories.
Ashley Farley - 19 Jun 2018 05:02
1) Number of downloads from data repositories 2) Number of CC0 policies associated to datasets
Elena Giglia - 30 Aug 2018 12:49
Number of papers reporting research that reuses existing data sets
Jeroen Bosman - 31 Aug 2018 21:53
Number of papers based on openly available (raw) data
Jeroen Bosman - 31 Aug 2018 21:53
Number/share of papers having a statement on data availability
Jeroen Bosman - 31 Aug 2018 21:54
Additions of new data/datasets to data repositories, by field
Jeroen Bosman - 31 Aug 2018 21:54
Openly available datasets, by license
Jeroen Bosman - 31 Aug 2018 21:55

Funders' policies

Number of Funders with policies on data sharing* (Source: Sherpa Juliet)
As above with OA, this data should always be used with the caveat that it is likely skewed towards UK/European/Western organisations based on its (limited) funding from Jisc - for example, Japan's JST has an open data policy not listed in JULIET: http://openscience.jp/oa/oa-po...
Tony Ross-Hellauer - 15 Jul 2018 17:04
Any additional indicator (and source)?
Number of funders which publish a transparency report recording all the APC fees paid.
Federico Leva - 17 Jun 2018 18:03
Number of funders that share grant information as Open Data.
Egon Willighⓐgen - 01 Jul 2018 11:17

Journals' policies

Number of Journals with policies on data sharing* (Source: Vasilevsky et al, 2017)
additional indicator: number of publishers/journals that have adopted the TOP Guidelines (including the level of adoption/actual implementation where possible). Source: https://cos.io/our-services/to...
Bianca Kramer - 30 May 2018 14:45
Any additional indicator (and source)?
Additional metrics: same as above, but also for: 1. software sharing, 2. method sharing
Egon Willighⓐgen - 01 Jul 2018 11:18
Additional metric: Number of journals that deposit supplementary information in open repositories for long-term archival (e.g. Elsevier doesn't guarantee long-term availability of supplementary information)
Egon Willighⓐgen - 01 Jul 2018 11:18
Number of Software Journals (e.g. JORS https://openresearchsoftware.m... )
Alessandro Sarretta - 04 Jul 2018 01:45
Number of journals implementing open peer review.
Dⓐsapta Erwin Irawan - 10 Jul 2018 15:08
Number of journals by TOP guidelines variables and level
Jeroen Bosman - 31 Aug 2018 22:05
Number of data citations
Jeroen Bosman - 31 Aug 2018 22:13

Researchers' attitude towards data sharing

% of papers published with data (Source: Bibliometrics: Datacite)
Citations of data journals (Source: Bibliometrics: Datacite)
Attitude of researchers on data sharing* (Source: Survey by Elsevier, follow-up of the 2017 - Report: https://www.elsevier.com/about/open-science/research-data/open-data-report)
Why not ask directly on Twitter and with surveys via official EU channels or programs like the Marie Curie fellows?
Lluís Revilla - 15 Jun 2018 14:56
I agree with Lluis's comment. This type of survey should be conducted either by the official EU channels or by a neutral 3rd party. Having Elsevier manage this survey could lead to too much bias, as well as to survey respondents being unwilling to participate or to provide Elsevier with their information. Using the same questions issued through multiple channels would be ideal.
Ashley Farley - 18 Jun 2018 23:27
figshare is also producing annual surveys
J. Colomb, @pen - 02 Jul 2018 12:52
Do not limit surveys to those organised by entities having clear conflict of interest on this topic.
Etienne Gaudrain - 02 Jul 2018 13:00
Elsevier has a strong negative image in connection with open science in the research community. Researchers may be unwilling to participate.
Marc Schoenwiesner - 03 Jul 2018 10:29
If I read this correctly, this indicator is intended to be a follow-up to the previous Elsevier-led survey. Can the consortium assure us that Elsevier will have no part in the design, dissemination, data collection, or analysis of this round of the survey? If it cannot, it would contradict Paul Hofheinz's recent public statement that: "Elsevier ... is a subcontractor to the project, having agreed to provide data. For the record, Elsevier has no involvement in defining, building or controlling any of the indicators that make up the Open Science Monitor, which the consortium is contracted to produce."
Tony Ross-Hellauer - 15 Jul 2018 16:04
Get rid of Elsevier! Their data are fake: - "it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment" (San Francisco Declaration) - 'Predatory' open access: a longitudinal study of article volumes and market characteristics, Cenyu Shen and Bo-Christer Björk, BMC Medicine 2015, 13:230, DOI: 10.1186/s12916-015-0469-2 - https://openarchiv.hypotheses.... - https://openarchiv.hypotheses....
odilehennaut - 18 Jul 2018 16:01
Carry out your own independent survey or at least perform a systematic review of all survey research on researcher data sharing views
Jeroen Bosman - 31 Aug 2018 22:15
Any additional indicator (and source)?
Number of authors (e.g. identified by ORCID) who have published or deposited some work under a free/open license. Source: BASE, iterate over results to list the authors. https://www.base-search.net/Se...
Federico Leva - 17 Jun 2018 17:33
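Editor's note: BASE's API requires registration, so as a stand-in here is a hedged sketch of the same author-counting idea over a single OAI-PMH endpoint (Zenodo's, as an example). Treating any dc:rights value containing a Creative Commons URL as "free/open license" is a crude assumption, and the harvest is capped for illustration.

    # Sketch: count distinct authors with at least one openly licensed deposit.
    from sickle import Sickle  # pip install sickle (OAI-PMH harvesting client)

    repo = Sickle("https://zenodo.org/oai2d")  # example OAI-PMH endpoint
    authors = set()
    for record in repo.ListRecords(metadataPrefix="oai_dc"):
        meta = record.metadata
        rights = " ".join(meta.get("rights", []))
        if "creativecommons.org" in rights:  # crude open-license test
            authors.update(meta.get("creator", []))
        if len(authors) > 500:               # cap the harvest for this sketch
            break
    print("distinct authors with an openly licensed deposit:", len(authors))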
Number of journal declarations of independence. Possible source: OAD. http://oad.simmons.edu/oadwiki...
Federico Leva - 17 Jun 2018 18:08
Percentage of data coming with any metadata. Percentage of computer-readable data.
J. Colomb, @pen - 02 Jul 2018 12:53
Number of datasets deposited in scientific data repositories (from Re3data?). Number of papers in data journals (e.g. Earth System Science Data, Scientific Data, ...). Type of licence applied to data: CC0, CC-BY, ...
Alessandro Sarretta - 04 Jul 2018 01:38
The availability of an RDMP document, to enforce data sharing and reuse.
Dⓐsapta Erwin Irawan - 10 Jul 2018 15:06
number of data papers
Jeroen Bosman - 31 Aug 2018 22:06

Open collaboration

Open Code

Number of code projects with DOI (Source: Mozilla Codemeta)
Software citations in DataCite
Bianca Kramer - 30 May 2018 14:46
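Editor's note: a sketch of one way to count software DOIs via the DataCite REST API, which could feed both this suggestion and the indicator above; the resource-type-id filter is part of DataCite's documented API, but treat the exact parameter spelling as an assumption to verify.

    # Sketch: how many DOIs are registered for software in DataCite?
    import requests

    r = requests.get("https://api.datacite.org/dois",
                     params={"resource-type-id": "software", "page[size]": 1})
    r.raise_for_status()
    print("software DOIs in DataCite:", r.json()["meta"]["total"])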
Number of code projects in Zenodo
Rebecca Lawrence - 11 Jun 2018 17:13
OK, but there is more here too. What about projects which have a README file, a Code of Conduct, a how-to-collaborate file, and an explicit license? Also, you could measure the number of pull requests, issues, commits, and contributors to projects.
Jon Tennant - 06 Jul 2018 19:00
Number of scientific API* (Source: Programmableweb)
% of journals with open code policy* (Source: Stodden 2013)
additional indicator: number of publishers/journals that have adopted the TOP Guidelines (including the level of adoption/actual implementation where possible). Source: https://cos.io/our-services/to...
Bianca Kramer - 30 May 2018 14:46
Number of scientific projects on Github (Source: Github)
Number of GitHub projects archived on Zenodo
Bianca Kramer - 30 May 2018 14:46
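Editor's note: Zenodo's GitHub integration records the repository URL among a record's related identifiers, so a full-text search for "github.com" on Zenodo's public records API gives a rough upper-bound heuristic for this indicator; the query string is an assumption, not an official filter.

    # Sketch: heuristic count of Zenodo records that mention github.com.
    import requests

    r = requests.get("https://zenodo.org/api/records",
                     params={"q": '"github.com"', "size": 1})
    r.raise_for_status()
    print("Zenodo records mentioning github.com (heuristic):",
          r.json()["hits"]["total"])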
Remove GitHub unless it publishes a data dump with said repositories. (For instance on Software Heritage.)
Federico Leva - 17 Jun 2018 17:22
Add: number of software deposits under an OSI-approved license. Source: BASE. https://www.base-search.net/Se...
Federico Leva - 17 Jun 2018 17:23
And GitLab, SourceForge, Bitbucket... these must not be excluded.
Egon Willighⓐgen - 01 Jul 2018 11:19
Question: how can you select/filter "scientific" projects from other types of projects?
Alessandro Sarretta - 04 Jul 2018 01:40
This definitely needs to be extended to other repositories. The ones mentioned by Egon Willighagen above are a start, but many more exist. The Journal of Open Research Software has an extensive list: https://openresearchsoftware.m... Again, a case where the proposed methodology would under-report already existing efforts towards OA.
Konstantin Stadler - 25 Jul 2018 12:16
Any additional indicator (and source)?
There are other sources for storing programs: GitLab, SourceForge, Bitbucket, among those that use git for version control.
Lluís Revilla - 15 Jun 2018 12:51
Not sure what category this would fall under but I would love to see open protocols & materials tracked as well - protocols.io & AddGene are great resources
Ashley Farley - 19 Jun 2018 05:14
Number of Software papers in Software Journals (e.g. JORS https://openresearchsoftware.m... and others)
Alessandro Sarretta - 04 Jul 2018 01:42
+1 for the comment by Alessandro Sarretta. Worrisome that these journals are not listed in Scopus... Also, releases in Zenodo should be counted.
Konstantin Stadler - 25 Jul 2018 12:17
Number of software citations
Jeroen Bosman - 31 Aug 2018 22:12

Open scientific hardware

Number of projects on open hardware repository* (Source: Open Hardware repository)
Number of projects using open hardware license* (Source: Open Hardware repository)
Would be great to see if there is an opportunity to collaborate with CodeOcean
Ashley Farley - 19 Jun 2018 05:21
Note that on the European scene, every country may have their own version of such a license. The first work would be to list these across EU countries.
Etienne Gaudrain - 02 Jul 2018 13:02
Any additional indicator (and source)?
related: number of articles citing RRIDs
Jeroen Bosman - 31 Aug 2018 22:20

Citizen science

N. Projects in Zooniverse and Scistarter* (Source: Zooniverse and Scistarter)
N. Participants in Zooniverse and Scistarter (Source: Zooniverse and Scistarter)
Any additional indicator (and source)?
Number of DOIs cited from Wikipedia and Wikimedia projects. Source: data dumps + mwcites or https://doi.org/10.6084/m9.fig...
Federico Leva - 17 Jun 2018 17:30
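Editor's note: a crude stand-in for the mwcites approach named above, streaming a Wikipedia XML dump and extracting DOI-like strings with a regular expression. The dump filename is a placeholder, and mwcites itself handles wiki templates and deduplication far more carefully.

    # Sketch: count distinct DOI-like strings in a Wikipedia dump.
    import bz2
    import re

    DOI_RE = re.compile(r'10\.\d{4,9}/[^\s"<>|{}]+')
    dois = set()
    with bz2.open("enwiki-latest-pages-articles.xml.bz2", "rt",
                  encoding="utf-8", errors="ignore") as dump:  # placeholder path
        for line in dump:
            dois.update(DOI_RE.findall(line))
    print("distinct DOI-like strings found:", len(dois))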
As I don't see this anywhere else, what about scientific communication and outreach initiatives? This public dimension of open science is almost totally lacking from the indicators. Along these lines, journals which encourage commenting, and the addition of non-specialist summaries, should be included.
Jon Tennant
- 06 Jul 2018 19:02
I think the uptake of scientific results in Wikipedia is an important aspect not yet accounted for.
Konstantin Stadler
- 25 Jul 2018 12:20
Stakeholders engaged in setting up participatory innovation projects (like citizen science projects) should ensure the integration of sex/gender analysis in order to guarantee that innovative processes benefit all segments of population without bias.
Marina Angelaki
- 27 Aug 2018 13:19
Track the scientific blogosphere; there might be blogs or projects addressing societal needs and involving societal actors/citizens. Again, SSH disciplines seem to be neglected.
OPERAS
- 31 Aug 2018 08:55
members of national and international citizen science organizations
Jeroen Bosman
- 31 Aug 2018 22:27
mentions of citizen science in newspapers
Jeroen Bosman
- 31 Aug 2018 22:27
mentions of citizen science in full text of articles
Jeroen Bosman
- 31 Aug 2018 22:28

Altmetrics

P(tracked) - # Scopus publications that can be tracked by the different sources (e.g. typically only publications with a DOI, PMID, Scopus id, etc. can be tracked). (Source: Scopus & Plum Analytics)
Why not use the https://www.altmetric.com/ company?
Lluís Revilla
- 15 Jun 2018 14:49
As in the treatment of OA publications via Scopus, the use of Scopus, Mendeley and Plum Analytics immediately places a specific bias on the analysis of research's alternative impact. Such bias is antithetical to an "open science" monitoring system. Is there not an opportunity for something more holistic? Or even for alternative metrics to be delivered by, say, Altmetric? Note again that by using Scopus as the benchmark for published works, the underlying dataset will also be skewed towards the indexing coverage of Scopus.
George Macgregor
- 18 Jun 2018 15:47
Exclude Scopus and Plum Analytics as there is a clear conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:04
I would recommend looking at this article that compares Plum Analytics, Altmetric and Crossref Event Data: https://osf.io/938wc/ A combination of all three would give a better picture.
Stephanie Dawson
- 06 Jul 2018 18:25
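Of the three providers compared in the linked study, Crossref Event Data is the one with a fully open API, which makes the cross-validation suggested here feasible in practice. A minimal Python sketch; the endpoint, parameters and response fields are assumptions based on the public API rather than anything specified in this document, and the DOI is a placeholder:

import requests
from collections import Counter

doi = "10.1234/example.doi"  # placeholder DOI
resp = requests.get(
    "https://api.eventdata.crossref.org/v1/events",
    params={"obj-id": "https://doi.org/" + doi, "rows": 100},
)
events = resp.json()["message"]["events"]
# Tally which platforms the events came from (e.g. twitter, wikipedia).
print(Counter(e["source_id"] for e in events))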
I see a conflict of interest in only relying on Scopus, Mendeley and Plum Analytics for this.
Stephanie Dawson
- 06 Jul 2018 18:26
What about Researchfish? https://www.researchfish.net/
Stephanie Dawson
- 06 Jul 2018 18:27
ImpactStory is also completely lacking from this.
Jon Tennant
- 06 Jul 2018 19:03
Please don't rely on one data source; combine Altmetric, Plum and ImpactStory.
Konstantin Stadler
- 25 Jul 2018 12:19
I agree with what is mentioned in the comments above regarding diverse data sources (and would also like to see the principle of open sources and methods in use). I would also like to challenge the use of altmetrics as one concept used as indicators under Open collaboration. It is not clarified what is meant by Open collaboration (between researchers, or with the larger community?) in the draft methodological note, so it is hard to tell which indicator would be suitable: Mendeley could be a choice for collaboration between researchers, preferably alongside other data sources so as not to rely on only one.
Camilla Lindelöw
- 31 Aug 2018 10:47
P(mendeley) - # Scopus publications with readership activity in Mendeley (Source: Scopus, Mendeley & Plum Analytics)
There are other databases of papers one reads/stores, like Zotero...
Lluís Revilla
- 15 Jun 2018 14:50
Yes, other platforms must be included, such as Zotero, ResearchGate, ScienceOpen, etc, etc, etc
Egon Willighⓐgen
- 01 Jul 2018 11:22
Exclude Scopus, Mendeley and Plum Analytics as there is a clear conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:04
Mendeley is not related to Altmetrics or Open Science. This needs to be scrapped.
Etienne Gaudrain
- 02 Jul 2018 13:08
This actively discriminates against researchers who do not use Mendeley, and therefore should be removed as any form of reliable indicator. Indeed, rather than having us question all of these indicators, where is the rationale and justification for any of them?
Jon Tennant
- 06 Jul 2018 19:04
The use of Mendeley and Plum Analytics, owned by Elsevier like Scopus, may not be compatible with EU competition law. The exclusive use of Scopus, Mendeley and Plum Analytics for OA monitoring may constitute a violation of EU competition law through abuse of market dominance, even if the metrics would officially be computed by CWTS. This configuration should be checked by the EU competition commission. Alternatives include Altmetric, Zotero, Papers, EndNote, Reference Manager, BibTeX, etc.
Sylvie Vullioud
- 28 Jul 2018 16:10
Endogamic indicator! You compare an Elsevier-owned database against an Elsevier-owned social network, which, by the way, most users left the day after the Elsevier acquisition. All these Scopus/Mendeley indicators are useless as they are too narrow.
Elena Giglia
- 30 Aug 2018 12:54
PP(mendeley) - Proportion of publications covered on Mendeley. P(mendeley)/P(tracked) (Source: Scopus, Mendeley & Plum Analytics)
And the other publications not covered by Mendeley but covered by other services? For example Sci-Hub?
Lluís Revilla
- 15 Jun 2018 14:50
Or what about looking at the metrics that the specific journals themselves use?
Ashley Farley
- 19 Jun 2018 05:11
Using just Mendeley is absolutely misleading; many researchers left it the day after the Elsevier acquisition. There are lots of other academic social networks to be taken into account.
Elena Giglia
- 30 Aug 2018 13:44
The use of Mendeley is misleading as it is not the only academic social network. Moreover, coupling it with Scopus results in a sort of "endogamic" indicator referring just to the Elsevier sphere.
OPERAS
- 31 Aug 2018 08:56
AFAIK Mendeley "reads" include instances where only bibliographic data were downloaded, and thus are a very questionable indicator.
Jeroen Bosman
- 31 Aug 2018 22:32
TRS - Total Readership Score of Scopus publications. Sum of all Mendeley readership received by all P(tracked) (Source: Scopus, Mendeley & Plum Analytics)
How is this related to open science?
Lluís Revilla
- 15 Jun 2018 14:50
This is not related to Open Science. Remove this metric.
Etienne Gaudrain
- 02 Jul 2018 13:09
TRS(academics) - Total Readership Score of Scopus publications from Mendeley academic users (PhDs, professors, postdocs, researchers, etc.) (Source: Scopus, Mendeley & Plum Analytics)
Why not use the ORCID self-description instead of the Mendeley user database?
Lluís Revilla
- 15 Jun 2018 14:51
Because Elsevier are the sub-contractor for this.
Jon Tennant
- 17 Jun 2018 12:52
Remove. This is uninformative and very volatile.
Egon Willighⓐgen
- 01 Jul 2018 11:21
This is not related to Open Science. Remove this metric.
Etienne Gaudrain
- 02 Jul 2018 13:09
TRS(students) - Total Readership Score of Scopus publications from Mendeley student users (Master and Bachelor students) (Source: Scopus, Mendeley & Plum Analytics)
Is the database from Scopus and Mendeley really representative of this group?
Lluís Revilla
- 15 Jun 2018 14:52
Remove. This is extremely volatile.
Egon Willighⓐgen
- 01 Jul 2018 11:21
This is not related to Open Science. Remove this metric.
Etienne Gaudrain
- 02 Jul 2018 13:09
Mendeley readership data are highly questionable
Jeroen Bosman
- 31 Aug 2018 22:34
TRS(professionals) - Total Readership Score of Scopus publications from Mendeley professional users (librarians, other professionals, etc.) (Source: Scopus, Mendeley & Plum Analytics)
Exclude Scopus, Mendeley and Plum Analytics as sources, as there is an obvious conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:10
MRS - Mean Readership Score. TRS/P(tracked) (Source: Scopus & Plum Analytics)
I assume this will be based on Mendeley, as the other readership scores are. Dividing them into subcategories may be misleading, as we know that only a certain share of users state their career status and whether they belong to academia or not. The users also introduce discipline and career-age biases that have to be stated clearly (see for example: https://wlv.openrepository.com... ). Other sharing/reading sources could complement Mendeley.
Camilla Lindelöw
- 31 Aug 2018 11:19
MRS(academics) - TRS(academics)/P(tracked) (Source: Scopus & Plum Analytics)
MRS(students) - TRS(students)/P(tracked) (Source: Scopus & Plum Analytics)
MRS(professionals) - TRS(professionals)/P(tracked) (Source: Scopus & Plum Analytics)
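The readership scores defined above reduce to simple sums and means over the tracked set. A toy Python illustration with invented numbers (three tracked publications, readership split by Mendeley user group):

pubs = [
    {"academics": 12, "students": 5, "professionals": 1},
    {"academics": 0, "students": 2, "professionals": 0},
    {"academics": 7, "students": 0, "professionals": 3},
]
p_tracked = len(pubs)                                 # P(tracked) = 3
trs = sum(sum(p.values()) for p in pubs)              # TRS = 30
mrs = trs / p_tracked                                 # MRS = TRS/P(tracked) = 10.0
mrs_students = sum(p["students"] for p in pubs) / p_tracked  # MRS(students) ~ 2.33
print(trs, mrs, round(mrs_students, 2))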
P(twitter) - # Scopus publications that have been mentioned in at least one (re)tweet (Source: Scopus & Plum Analytics)
Why look only at Scopus publications that are tweeted? Not being in Scopus doesn't make a publication less open.
Lluís Revilla
- 15 Jun 2018 14:53
Use multiple resources, not only Plum.
Egon Willighⓐgen
- 01 Jul 2018 11:23
Exclude Scopus and Plum Analytics from the sources, as there is an obvious conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:11
How does this metric capture anything informative, besides that an article has been tweeted? How does it differentiate between 1 and 1000 retweets, and the additional discussion that this might have catalysed?
Jon Tennant
- 06 Jul 2018 19:05
In complete agreement with Jon (and also Etienne, but that's a different matter). Twitter is antithetical to science writing/reading.
David Morris
- 02 Aug 2018 16:42
I agree with the comments above, and would also like to add that it has to be stated how Twitter indicators point to open collaboration (the title of this section). One finding is that tweeted publications are generally in categories where we don't count citations (news, editorials), and as such they could perhaps be used for an alternative attention indicator, but not necessarily a collaboration indicator (https://www.altmetric.com/blog... )
Camilla Lindelöw
- 31 Aug 2018 11:30
PP(twitter) - Proportion of publications mentioned on Twitter. P(twitter)/P(tracked) (Source: Scopus & Plum Analytics)
The service provided by https://www.altmetric.com/ is an alternative score.
Lluís Revilla
- 15 Jun 2018 14:52
Exclude Scopus and Plum Analytics from the sources, as there is an obvious conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:11
TTWS - Total Twitter Score. Sum of all tweet mentions received by all P(tracked) (Source: Scopus & Plum Analytics)
The service provided by https://www.altmetric.com/ is an alternative score.
Lluís Revilla
- 15 Jun 2018 14:52
Exclude Scopus and Plum Analytics from the sources, as there is an obvious conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:11
MTWS - Mean Twitter Score. TTWS/P(tracked) (Source: Scopus & Plum Analytics)
Exclude Scopus and Plum Analytics from the sources, as there is an obvious conflict of interest.
Etienne Gaudrain
- 02 Jul 2018 13:11
Where are the other social media platforms? Where are ResearchGate, Academia.edu, Humanities commons, Google Plus, LinkedIn, Facebook, Reddit, as well as those which are used more by non-European communities?
Jon Tennant
- 06 Jul 2018 19:09
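The Twitter indicators defined above are equally simple aggregates, which is part of what the comments object to: P(twitter) cannot distinguish one tweet from a thousand, only TTWS can. A toy Python illustration with invented counts:

tweets = [4, 0, 1, 0, 10]                 # tweets per tracked publication

p_tracked = len(tweets)                   # P(tracked) = 5
p_twitter = sum(t > 0 for t in tweets)    # P(twitter): pubs with >= 1 tweet = 3
pp_twitter = p_twitter / p_tracked        # PP(twitter) = 0.6
ttws = sum(tweets)                        # TTWS = 15
mtws = ttws / p_tracked                   # MTWS = 3.0
print(pp_twitter, ttws, mtws)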

Propose new trends

Please add here trends not currently captured above, and suggest possible indicators and sources for them.
Only include indicators that are themselves open, so data can be reused and results can be reproduced
Bianca Kramer
- 30 May 2018 14:47
e.g.: preregistrations (OSF, clinicaltrials.gov, aspredicted.org); journals accepting preregistered reports; journals practicing open peer review (names published, reports published); institutions with open science aspects included in T&P policies; institutions, funders and publishers that are DORA signatories; publishers/journals that do not advertise impact factor on journal or article webpages; journals accepting preprints; preprints published (Crossref, DataCite, both with caveats); journals being transparent about the breakdown of APC costs.
Bianca Kramer
- 30 May 2018 16:02
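One of the indicators suggested above, preprints registered with Crossref, is countable today through the open Crossref REST API, where preprints carry the "posted-content" type. A minimal Python sketch; the filter name and response layout follow the public API as generally documented (the caveats noted in the comment still apply), and the contact address is a placeholder:

import requests

resp = requests.get(
    "https://api.crossref.org/works",
    params={
        "filter": "type:posted-content",  # Crossref's type for preprints
        "rows": 0,                        # we only want the total count
        "mailto": "you@example.org",      # placeholder polite-pool contact
    },
)
print(resp.json()["message"]["total-results"])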
Regarding suggestion above for tracking tenure and promotion policies: see corresponding project and data for US/Canada: https://www.scholcommlab.ca/20... (preliminary findings); , https://www.scholcommlab.ca/20... (data, CC-0)
Bianca Kramer
- 30 May 2018 16:57
What is absolutely clear here, and needs to be dealt with, is the enormous conflict of interest apparent. As has been noted multiple times above, it is bizarre for so many of these trends and criteria to be solely focused on Elsevier-owned services, such as Scopus and Mendeley. There are numerous reasons why this is strange. Firstly, it comes at a time when many EU member states are in conflict with Elsevier over its business strategies. Secondly, it creates a dependency on these commercial services, as well as an information bias. (1/2)
Jon Tennant
- 17 Jun 2018 12:56
(2/2) Thirdly, Elsevier have a commercial interest in promoting its own services over those of its competitors. As evidenced above, this is a clear conflict of interest, and must be treated as such. If researchers become locked into, or dependent on, this process, then Elsevier has more control over how the research process occurs, how content is produced and distributed, and how those are assessed. The best approach to resolve this would be to remove Elsevier as sub-contractor, and reform with an independent expert group to provide the same service without the inherent COI.
Jon Tennant
- 17 Jun 2018 13:00
I agree with all the comments so far. I therefore won't restate them, other than to say that my responses to previous aspects of the methodological note above provide practical examples of the philosophical and practical issues with using restrictive sources as the basis for monitoring. It is an odd contradiction: monitoring open science using sources that, in part, the open science movement grew in reaction to.
George Macgregor
- 18 Jun 2018 16:10
I agree with all of the above comments. This is an opportunity to untangle many of the issues existing in the current research ecosystem. Do not pass up this opportunity without careful consideration. This is worth doing well - even if it's difficult.
Ashley Farley
- 19 Jun 2018 05:09
Including for-profit corporations that have a financial incentive to promote their own brands, marginalise open access players and control the narrative around Open Access is counterintuitive and morally bankrupt. Elsevier should be removed as a subcontractor and all brands Elsevier controls should be removed.
Kade Morton
- 24 Jun 2018 03:20
No mention of citation of articles in Wikipedia or core research databases. Must be added.
Egon Willighⓐgen
- 01 Jul 2018 11:24
Thank you very much for opening this step of your work to our comments. I hope you will continue to follow this track and make your methodology open and pre-registered. Please make sure that you are not creating incentives to use a particular service by monitoring only part of the spectrum (this also includes not relying too much on Twitter...), and prefer open source initiatives over closed products.
J. Colomb, @pen
- 02 Jul 2018 13:01
It is obvious that Elsevier has a clear conflict of interest in their role as subcontractor for the OSM, and they should be removed from that position. At a minimum, they should be barred from using their own entities as sources. In its current state, there is no guarantee of objectivity whatsoever.
Etienne Gaudrain
- 02 Jul 2018 13:16
Metrics to measure the uptake of Open Science in industry are missing. They must be added.
Egon Willighⓐgen
- 02 Jul 2018 20:17
Another metric I would like to see added: the percentage of EC grant proposals, both accepted and rejected, published #openaccess in a journal like @RIOJournal.
Egon Willighⓐgen
- 03 Jul 2018 06:42
To echo the comments above, closed databases, and metrics derived from closed databases, are bad for open science. Relying on commercial services who have a COI is also bad for open science.
Justin Salamon
- 03 Jul 2018 17:36
Using closed databases to track open research is rather ironic. More importantly, it is unnecessary and harmful. Please do not use Elsevier as a subcontractor.
Julian Holstein
- 03 Jul 2018 19:30
I absolutely agree with the previous comments. Open databases are key and subcontracting Elsevier who has been opposing OA to monitor OA is counterintuitive to say the least. Thank you for the open process.
Niki Vavatzanidis
- 04 Jul 2018 11:31
Data quality and consistency between data providers must be evaluated. Example work of this kind: https://osf.io/938wc/
Egon Willighⓐgen
- 06 Jul 2018 10:48
For those who might not be aware, please note that there has been a quite vigorous debate about the role of Elsevier here. All relevant links can be found in the latest here: http://fossilsandshit.com/resp... (excuse the URL name..)
Jon Tennant
- 06 Jul 2018 19:10
There is no mention about monitoring equality and diversity to ensure that these ongoing research and impact assessment criteria do not negatively affect any particular communities or demographics.
Jon Tennant
- 10 Jul 2018 13:26
How will these indicators reflect disciplinary differences? For example, most of them are to do with citations, articles, and data, and are therefore highly geared towards STEM subjects, to the exclusion of the social sciences, humanities, and arts.
Jon Tennant
- 10 Jul 2018 13:30
Evaluating a policy is difficult, and the EU is looking for a single body to manage that. Hence the choices for a more or less general source are Scopus (European), Clarivate Analytics (American), or Google Scholar (American). Add a touch of lobbying from Elsevier (and probably Springer) and you get the result. But the problem is that the result will be completely biased, proprietary and possibly subject to a lot of manipulation, as mentioned in the other comments. The real solution would be a specific, independent EU body for the follow-up that processes all the sources mentioned.
Jean-Pierre Merlet
- 13 Jul 2018 11:46
For the future: please provide an ORCID login - not everyone is on Twitter and Facebook.
Konstantin Stadler
- 25 Jul 2018 12:35
Awareness must be raised in the OS/OI policy and research community on the relevance of gender and the ways OS/OI can mitigate gender inequality and bias in the various aspects of OS/OI. The Genderaction project has studied the links between OS/OI and gender, and a report will soon be publicly available. Key findings have been published in the form of a policy brief: http://genderaction.eu/wp-cont...
Marina Angelaki
- 27 Aug 2018 13:18
The European Commission and its OSPP Expert Group, along with other stakeholders involved in research assessment (RFOs, RPOs) are encouraged to explore how/if the use of new metrics impacts men and women researchers at different career stages and disciplines differently. The choice of metrics during research assessment/evaluation procedures has important implications on researchers’ priorities and strategies, their choice of publication venue and the way they present their CVs.
Marina Angelaki
- 27 Aug 2018 13:21
Open Peer Review: As new forms of peer review (like OPR) are being introduced it is important to examine how different practices impact on gender bias and explore ways of adequately addressing them.
Marina Angelaki
- 27 Aug 2018 13:24
1) Preprints, preregistration practices, open peer review, open lab notebooks, protocols and openly shared workflows are completely absent from this framework, but they are the core of Open Science practices. 2) You need to include SSH disciplines in this picture. 3) Scientific blogging is also absent (see e.g. Hypotheses.org). 4) Alternative research assessment criteria or best practices adopted, as collected by DORA.
Elena Giglia
- 30 Aug 2018 13:00
The picture coming out is highly biased towards STEM disciplines and English-speaking resources, and limited to journals only. To include the SSH: 1) Create a new set of indicators specifically referring to monographs, using e.g. DOAB as a source. 2) Include specific SSH-oriented tools like Isidore, which address the multilingualism issue. 3) Track SSH-oriented projects developed by open infrastructures like DARIAH or CLARIN. 4) Include scientific blogs to track the open debate in the SSH, using e.g. Hypotheses.org.
OPERAS
- 31 Aug 2018 09:00
I agree with many of the comments above (Bianca Kramer and Jon Tennant) as well as others. The main point is: how can the EC/EU create and use an open science monitor based on a single source (Elsevier)? Elsevier does NOT represent openness.
Lotta Svantesson
- 31 Aug 2018 14:53
We thank the Commission for the initiative of the Open Science Monitor. This tool must be free of bias and trustworthy. The methodology raises several questions. We detail our remarks and recommendations for future work in a five-page letter. We suggest building the next generation of the open science monitor with a variety of open data sources. Bernard Larrouturou, Director-General for Research and Innovation and President of the French Open Science Committee. https://www.dropbox.com/s/pyi1...
Bernard Larrouturou
- 31 Aug 2018 18:20
Institutions having open science in their mission statement
Jeroen Bosman
- 31 Aug 2018 21:58
Institutions having open science in their researcher code of conduct
Jeroen Bosman
- 31 Aug 2018 21:59
Number of journals supporting registered reports document type
Jeroen Bosman
- 31 Aug 2018 22:08
Number of preregistrations
Jeroen Bosman
- 31 Aug 2018 22:37
journals indicating contributor roles
Jeroen Bosman
- 31 Aug 2018 22:39
research funding proposals in which open science is explicitly mentioned
Jeroen Bosman
- 31 Aug 2018 22:40
academic job ads mentioning/asking for open science experience
Jeroen Bosman
- 31 Aug 2018 22:41
jobs for data managers, data consultants and data stewards at universities
Jeroen Bosman
- 31 Aug 2018 22:42
Number of papers published in journals that explicitly state to select papers just on rigour not on novelty
Jeroen Bosman
- 31 Aug 2018 22:44
number of replication studies published
Jeroen Bosman
- 31 Aug 2018 22:44
Number of papers with a plain language abstract
Jeroen Bosman
- 31 Aug 2018 22:47
Number of papers with abstracts in 2 or more world languages
Jeroen Bosman
- 31 Aug 2018 22:47
number of papers stating that only open source software is used
Jeroen Bosman
- 31 Aug 2018 22:48
number of scientific conferences posters shared openly in e.g. Figshare, Zenodo etc.
Jeroen Bosman
- 31 Aug 2018 22:50
Number of conference presentation slides shared openly
Jeroen Bosman
- 31 Aug 2018 22:50