Help us improve the indicators!
Last update 09 Oct 2018
64 paragraphs, 291 comments
To add a comment on each indicator or to suggest new indicators, just click on the bubble alongside each line.

Open access to publications

Green and gold open access
P - # Scopus publications that enter the analysis* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
P(oa) - # Scopus publications that are Open Access (CWTS method for OA identification)* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
P(green oa) - # Scopus publications that are Green OA* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
P(gold oa) - # Scopus publications that are Gold OA* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
PP(oa) - Percentage of OA publications over total publications* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
PP(green oa) - Percentage of green OA publications over total publications* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
PP(gold oa) - Percentage of gold OA publications over total publications* (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
TCS - Total Citation Score: sum of all citations received by P in Scopus (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
FWCI - Field-Weighted Citation Impact (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
TP1/TP10 - Top 1% / top 10% highly cited publications (Source: Scopus, DOAJ, ROAD, PubMedCentral, CrossRef, OpenAire)
Any additional indicator (and source)?

Funders' policies
Number of funders with open access policies* (Source: Sherpa Juliet)
Any additional indicator (and source)?

Journals' policies
Number of journals with open access policies* (Source: Sherpa Romeo)
Any additional indicator (and source)?

Open research data

Number of repositories
Number of open data repositories* (Source: Re3data)
Any additional indicator (and source)?

Funders' policies
Number of funders with policies on data sharing* (Source: Sherpa Juliet)
Any additional indicator (and source)?

Journals' policies
Number of journals with policies on data sharing* (Source: Vasilevsky et al., 2017)
Any additional indicator (and source)?

Researchers' attitude towards data sharing
% of papers published with data (Source: bibliometrics, DataCite)
Citations of data journals (Source: bibliometrics, DataCite)
Attitude of researchers on data sharing* (Source: survey by Elsevier, follow-up of the 2017 report: https://www.elsevier.com/about/open-science/research-data/open-data-report)
Any additional indicator (and source)?

Open collaboration

Open code
Number of code projects with a DOI (Source: Mozilla Codemeta)
Number of scientific APIs* (Source: ProgrammableWeb)
% of journals with an open code policy* (Source: Stodden 2013)
Number of scientific projects on GitHub (Source: GitHub)
Any additional indicator (and source)?

Open scientific hardware
Number of projects on the Open Hardware Repository* (Source: Open Hardware Repository)
Number of projects using an open hardware licence* (Source: Open Hardware Repository)
Any additional indicator (and source)?

Citizen science
Number of projects in Zooniverse and SciStarter* (Source: Zooniverse and SciStarter)
Number of participants in Zooniverse and SciStarter (Source: Zooniverse and SciStarter)
Any additional indicator (and source)?

Altmetrics
P(tracked) - # Scopus publications that can be tracked by the different sources (typically only publications with a DOI, PMID, Scopus ID, etc. can be tracked) (Source: Scopus & Plum Analytics)
P(mendeley) - # Scopus publications with readership activity in Mendeley (Source: Scopus, Mendeley & Plum Analytics)
PP(mendeley) - Proportion of publications covered on Mendeley: P(mendeley)/P(tracked) (Source: Scopus, Mendeley & Plum Analytics)
TRS - Total Readership Score of Scopus publications: sum of all Mendeley readership received by all P(tracked) (Source: Scopus, Mendeley & Plum Analytics)
TRS(academics) - Total Readership Score of Scopus publications from Mendeley academic users (PhD students, professors, postdocs, researchers, etc.) (Source: Scopus, Mendeley & Plum Analytics)
TRS(students) - Total Readership Score of Scopus publications from Mendeley student users (Master and Bachelor students) (Source: Scopus, Mendeley & Plum Analytics)
TRS(professionals) - Total Readership Score of Scopus publications from Mendeley professional users (librarians, other professionals, etc.) (Source: Scopus, Mendeley & Plum Analytics)
MRS - Mean Readership Score: TRS/P(tracked) (Source: Scopus & Plum Analytics)
MRS(academics) - TRS(academics)/P(tracked) (Source: Scopus & Plum Analytics)
MRS(students) - TRS(students)/P(tracked) (Source: Scopus & Plum Analytics)
MRS(professionals) - TRS(professionals)/P(tracked) (Source: Scopus & Plum Analytics)
P(twitter) - # Scopus publications that have been mentioned in at least one (re)tweet (Source: Scopus & Plum Analytics)
PP(twitter) - Proportion of publications mentioned on Twitter: P(twitter)/P(tracked) (Source: Scopus & Plum Analytics)
TTWS - Total Twitter Score: sum of all tweet mentions received by all P(tracked) (Source: Scopus & Plum Analytics)
MTWS - Mean Twitter Score: TTWS/P(tracked) (Source: Scopus & Plum Analytics)

Propose new trends
Please add here trends not currently captured above, and suggest possible indicators and sources for them.
Comments
Additional indicator: number of publishers/journals that have adopted the TOP Guidelines (including the level of adoption and actual implementation where possible). Source: https://cos.io/our-services/top-guidelines/
Software citations in DataCite
Number of GitHub projects archived on Zenodo
Only include indicators that are themselves open, so data can be reused and results can be reproduced.
E.g.: preregistrations (OSF, clinicaltrials.gov, aspredicted.org); journals accepting preregistered reports; journals practicing open peer review (names published, reports published); institutions with open science aspects included in T&P policies; institutions, funders and publishers that are DORA signatories; publishers/journals that do not advertise impact factor on journal or article webpages; journals accepting preprints; preprints published (Crossref, DataCite, both with caveats); journals being transparent about the breakdown of APC costs.
Regarding the suggestion above for tracking tenure and promotion policies: see the corresponding project and data for US/Canada: https://www.scholcommlab.ca/2018/05/30/preliminary-findings-from-the-review-promotion-and-tenure-study/ (preliminary findings); https://doi.org/10.7910/DVN/VY4TJE (data, CC-0)
Number of code projects in Zenodo
Why restrict this to Scopus only?
Why only Open Access in Scopus and not all articles that are open access?
What about other sources like Google Scholar?
It could be "improved" by trying to tag articles via machine learning...
Why not ask directly on Twitter and with surveys via official EU channels or programmes like the Marie Curie fellows?
There are other sources for storing code (GitLab, SourceForge, Bitbucket) among those that use git for version control.
Why not use the https://www.altmetric.com/ company?
There are other databases of papers one reads/stores, like Zotero...
And what about the publications not covered by Mendeley but covered by other services, for example Sci-Hub?
How is this related to open science?
Why not use the ORCID self-description instead of the Mendeley user database?
Is the database from Scopus and Mendeley really representative of this group?
Why look only at tweets of publications in Scopus? Not being in Scopus doesn't make a publication less open.
The service provided by https://www.altmetric.com/ is another alternative score.
In addition to all of the above, it should be questioned why Web of Science is not being used for this process, especially now that it provides OA status after integrating with Unpaywall. WoS was also used for the first version of the monitor, so it is unclear why this switch was made, instead of using both services.
Because Elsevier are the sub-contractor for this.
What is the CWTS method for Open Access identification? According to the note, this is a combination of 5 different sources (DOAJ, PMC, ROAD, CrossRef, and OpenAIRE). As mentioned above, these data can be obtained from a single service, such as Web of Science with Unpaywall. Again, it is unclear why Clarivate's service has been replaced with the more complex Elsevier one.
Will these data, matching algorithms, and other methods be made available?
Further to my comment above, I want to draw attention to a recent publication, in which two of the authors are from the CWTS: https://osf.io/preprints/socarxiv/k54uv/ This study uses the same sources, with the exception that it uses Google Scholar (free) instead of Scopus (paid). Why is this not considered here, when members of the same research group are using the method, and clearly to the same effect? They are able to adequately and accurately assess publisher-based OA proportions for the different 'types', and delineate the data also based on discipline and country.
If Scopus uses DOAJ and ROAD to assess the OA status of journals, should not these primary data sources be used instead? It is unclear here what additional benefit using Scopus has. https://blog.scopus.com/posts/more-ways-to-discover-content-from-open-access-journals-in-scopus
Also Dimensions or Web of Science. Constraining this to one source, which just happens to be owned by the sub-contractor, is an exceptionally bad practice.
I think it should be justified here what, if anything, high citation counts have to do with open science, which is about moving away from traditional evaluation metrics.
How about a breakdown of the cost associated with meeting OA for all of the above indicators? This could include breakdowns by country, by publisher, by journal, etc.
A number is a very coarse proxy here. It should include additional aspects such as APC caps, preference for green or gold, infrastructure support, licensing concerns, embargo concerns, and more.
Seeing as this is an open science monitor, why not include the different aspects of open science too? These could include open data, or more inclusive open science practices as well.
Number of journals that allow preprints and postprints, and what their embargo periods are. Suggest also looking at the OAS evaluation tool: https://www.tandfonline.com/doi/pdf/10.1080/00987913.2016.1182672 The fact that the Avoin Tiede report for measuring the openness of journals and publishers has not been included here, nor any of the evaluation criteria within, is deeply concerning. https://avointiede.fi/documents/10864/12232/OPENING+ACADEMIC+PUBLISHING+.pdf/a4358f81-88cf-4915-92db-88335092c992
Other things: journals that accept articles based on soundness only; journals that use registered reports; journals that encourage reproducibility studies; journals that encourage publication of 'negative results'. These are all key aspects of open science that are missing.
Furthermore, journals that still advertise impact factors in some form should be included in this.
OK, but there is more here too. What about projects which have a README file, a Code of Conduct, a how-to-collaborate file, and an explicit license? Also, you could measure the number of pull requests, issues, commits, and contributors to projects.
As I don't see this anywhere else, what about scientific communication and outreach initiatives? This public dimension of open science is almost totally lacking from the indicators. Along these lines, journals which encourage commenting, and the addition of non-specialist summaries, should be included.
ImpactStory is also completely lacking from this.
This actively discriminates against researchers who do not use Mendeley, and therefore should be removed as any form of reliable indicator.
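Several comments above suggest basing OA identification on Unpaywall rather than on Scopus. A minimal sketch of what a per-DOI lookup against Unpaywall's public REST API could look like follows; the /v2/{doi}?email=... endpoint and the is_oa / oa_status response fields are as documented by Unpaywall at the time of writing, but treat them as assumptions to verify, and note that a real monitor would need batching, caching and error handling.

```python
# Minimal sketch: look up OA status for a list of DOIs via the Unpaywall REST API.
# Assumptions to verify against the Unpaywall API docs: the /v2/{doi}?email=...
# endpoint and the is_oa / oa_status fields in the JSON response.
import json
import urllib.request
from collections import Counter

def oa_status(doi, email="you@example.org"):
    url = f"https://api.unpaywall.org/v2/{doi}?email={email}"
    with urllib.request.urlopen(url) as resp:
        record = json.load(resp)
    # oa_status is reported as e.g. "gold", "green", "hybrid", "bronze" or "closed"
    return record.get("is_oa", False), record.get("oa_status", "closed")

def oa_share(dois):
    """Share of DOIs per OA status, computed from individual lookups."""
    statuses = Counter(oa_status(doi)[1] for doi in dois)
    total = sum(statuses.values())
    return {status: n / total for status, n in statuses.items()} if total else {}

# Example usage (real network call, example DOIs):
# print(oa_share(["10.7717/peerj.4375", "10.1038/nature12373"]))
```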
Indeed, rather than having us question all of these indicators, where is the rationale and justification for any of them?
Because Elsevier are the sub-contractor for this.
How does this metric capture anything informative, besides that an article has been tweeted? How does it differentiate between 1 and 1,000 retweets, and the additional discussion that this might have catalysed?
Where are the other social media platforms? Where are ResearchGate, Academia.edu, Humanities Commons, Google Plus, LinkedIn, Facebook, Reddit, as well as those which are used more by non-European communities?
What is absolutely clear here, and needs to be dealt with, is the enormous conflict of interest apparent. As has been noted multiple times above, it is bizarre for so many of these trends and criteria to be solely focused on Elsevier-owned services, such as Scopus and Mendeley. There are numerous reasons why this is strange. Firstly, it comes at a time when many EU member states are in conflict with Elsevier over its business strategies. Secondly, it creates a dependency on these commercial services, as well as an information bias. Thirdly, Elsevier has a commercial interest in promoting its own services over those of its competitors. As evidenced above, this is a clear conflict of interest, and must be treated as such. If researchers become locked into, or dependent on, this process, then Elsevier has more control over how the research process occurs, how content is produced and distributed, and how those are assessed. The best approach to resolve this would be to remove Elsevier as sub-contractor, and reform with an independent expert group to provide the same service without the inherent COI.
For those who might not be aware, please note that there has been a quite vigorous debate about the role of Elsevier here. All relevant links can be found in the latest post here: http://fossilsandshit.com/response-to-president-paul-hofheinz-of-the-lisbon-council-regarding-elsevier-and-the-open-science-monitor/ (excuse the URL name..)
There is no mention of monitoring equality and diversity to ensure that these ongoing research and impact assessment criteria do not negatively affect any particular communities or demographics.
How will these indicators reflect disciplinary differences? For example, most of them are to do with citations, articles, and data, and therefore highly geared towards STEM subjects, to the exclusion of the social sciences, humanities, and arts.
It is not relevant to use a paid database to measure open science (or in this case open access articles): 1) Scopus only indexes journals based on registration and includes the ones that fit its criteria; many journals from the Global South (like Indonesia, where all journals are OA) are not included. 2) The Scopus database contains mostly English-based journals; again, very few journals from the Global South are listed. 3) Scopus does not index repositories. The scientific community can always go to Google Scholar, BASE, the SHARE database (and now Dimensions) to get articles that are not listed in the Scopus database. Promoting the usage of a paid database in this project can lead to more damage to the publishing system in non-US/EU countries, because they tend to follow the policy of western countries.
Number of journals that implement open peer review.
The availability of an RDMP document, to enforce data sharing and reuse.
Replace all sources with Unpaywall/ImpactStory, which already combine all the necessary/good ones.
Consider Semantic Scholar, CiteSeerX and CORE as sources.
Publisher-controlled sources should be replaced by CrossRef data, which are already collected from publishers. Only open citation data should be considered (see https://i4oc.org/).
Number of open access works whose references are also all open access (idea credit: John Dove). Source: iterate over open citation data at CrossRef and merge with Unpaywall data.
Number of academic entities publishing their subscription spending. Possible source: fund OpenAIRE to do it. Current sources: see e.g. http://stuartlawson.org/2016/06/publicly-available-data-on-international-journal-subscription-costs/
Number of funder policies available and linked in Wikidata?
Add: number of publishers which comply with the Creative Commons guidelines for marking Creative Commons works (a rel="license" links etc.). Possible source: https://docs.google.com/spreadsheets/d/10neMngtq9xE_5ZvZxwM3apDyvgYtS50X7KsZEMNf2ME/edit#gid=0 or search.creativecommons.org.
Number or percentage of journals with a BOAI-compatible, free and open license (CC-BY or CC-BY-SA). Source: DOAJ. https://doaj.org/search?source=%7B%22query%22%3A%7B%22filtered%22%3A%7B%22filter%22%3A%7B%22bool%22%3A%7B%22must%22%3A%5B%7B%22term%22%3A%7B%22_type%22%3A%22journal%22%7D%7D%2C%7B%22term%22%3A%7B%22index.license.exact%22%3A%22CC%20BY%22%7D%7D%5D%7D%7D%2C%22query%22%3A%7B%22match_all%22%3A%7B%7D%7D%7D%7D%2C%22from%22%3A0%2C%22size%22%3A10%7D
Number of journals with open peer review (possibly filtered by the DOAJ seal to ensure data quality). Source: DOAJ. https://doaj.org/search?source=%7B%22query%22%3A%7B%22filtered%22%3A%7B%22filter%22%3A%7B%22bool%22%3A%7B%22must%22%3A%5B%7B%22term%22%3A%7B%22_type%22%3A%22journal%22%7D%7D%2C%7B%22term%22%3A%7B%22bibjson.editorial_review.process.exact%22%3A%22Open%20peer%20review%22%7D%7D%2C%7B%22term%22%3A%7B%22index.has_seal.exact%22%3A%22Yes%22%7D%7D%5D%7D%7D%2C%22query%22%3A%7B%22match_all%22%3A%7B%7D%7D%7D%7D%2C%22from%22%3A0%2C%22size%22%3A10%7D
Number of open access journals hosted and fully owned by universities or university consortia or other academic-controlled non-profit entities. Source: DOAJ with manual filtering, or a new tag.
Percentage of open citations (deposited by publishers on CrossRef). Source: I4OC. https://i4oc.org/#progress
Number and/or total size of CC0 datasets. Source: BASE. https://www.base-search.net/Search/Results?lookfor=doctype%3A7+rights%3ACC0
Number of OAI-compliant repositories. Source: BASE content sources. https://www.base-search.net/about/en/about_sources_date.php
Number of repositories with an open data (https://opendefinition.org/) policy for metadata. Source: OpenDOAR, "commercial" in metadata reuse policy. http://opendoar.org/onechart.php?cID=&ctID=&rtID=&clID=&lID=&potID=2&rSoftWareName=&search=&groupby=pog.pogHeading&orderby=pog.pogID&charttype=pie&width=600&height=300&caption=Recorded%20Metadata%20Re-use%20Policies%20-%20Worldwide
Number of funders which publish a transparency report recording all the APC fees paid.
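The DOAJ-based counts suggested above (journals by license, journals with open peer review) could in principle be retrieved programmatically rather than through the website search URLs quoted in the comments. A rough sketch follows; the endpoint path, the Lucene-style field query syntax, and the "total" response field are assumptions to check against the current DOAJ API documentation, and the field names in the commented examples are hypothetical stand-ins for the filters used in the URLs above.

```python
# Rough sketch: count DOAJ journals matching a query via DOAJ's public search API.
# Assumptions to verify against the DOAJ API docs: the /api/search/journals/<query>
# endpoint, the field:"value" query syntax, and the "total" field in the response.
import json
import urllib.parse
import urllib.request

def doaj_journal_count(query):
    url = "https://doaj.org/api/search/journals/" + urllib.parse.quote(query, safe="")
    with urllib.request.urlopen(url) as resp:
        return json.load(resp).get("total", 0)

# Hypothetical field names, mirroring the filters in the search URLs quoted above:
# cc_by_journals = doaj_journal_count('bibjson.license.type:"CC BY"')
# open_pr_journals = doaj_journal_count('bibjson.editorial_review.process:"Open peer review"')
```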
Number of authors (e.g. identified by ORCID) who have published or deposited some work under a free/open license. Source: BASE, iterate over results to list the authors. https://www.base-search.net/Search/Results?type=all&lookfor=rights%3A%28CC-BY+CC-BY-SA+CC0+GPL+BSD+MIT%29+access%3A1
Number of journal declarations of independence. Possible source: OAD. http://oad.simmons.edu/oadwiki/Journal_declarations_of_independence
Remove GitHub unless it publishes a data dump with said repositories (for instance on Software Heritage).
Add: number of software deposits under an OSI-approved license. Source: BASE. https://www.base-search.net/Search/Results?type=all&lookfor=doctype%3A6+rights%3A%28GPL+BSD+MIT%29
Number of DOIs cited from Wikipedia and Wikimedia projects. Source: data dumps + mwcites, or https://doi.org/10.6084/m9.figshare.1299540
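For the indicator suggested just above (DOIs cited from Wikipedia and Wikimedia projects, via data dumps plus mwcites), a minimal, self-contained sketch of the extraction step is shown below. The regular expression and the page format are simplifications of what the mwcites tool actually does over full XML dumps, so treat this as an illustration only.

```python
# Minimal sketch of the extraction step behind "DOIs cited from Wikipedia":
# scan page wikitext for DOI-like strings and count distinct DOIs.
# This simplifies what mwcites does over complete Wikimedia XML dumps.
import re

# Simplified DOI pattern: "10." + 4-9 digit registrant code + "/" + suffix.
DOI_PATTERN = re.compile(r'\b10\.\d{4,9}/[^\s|<>\]}"]+')

def extract_dois(wikitext):
    """Return the set of DOI-like strings found in one page's wikitext."""
    return {m.group(0).rstrip(".,;") for m in DOI_PATTERN.finditer(wikitext)}

def count_distinct_dois(pages):
    """pages: iterable of wikitext strings (e.g. streamed from a dump)."""
    dois = set()
    for text in pages:
        dois |= extract_dois(text)
    return len(dois)

# Example (hypothetical citation template):
sample = "{{cite journal | doi = 10.1371/journal.pone.0115253 | title = ...}}"
print(extract_dois(sample))
```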
I would agree with Dasapta's comments here. By focusing on a single indexing & abstracting tool, the Open Science Monitor is immediately excluding a large corpus of relevant data and content. No indexing & abstracting service indexes all known literature, so a multiple-system approach is a necessity.
Such an approach may require the inclusion of DS Dimensions, 1findr (both of which are open) and WoS. Appreciate that Google Scholar and MS Academic might be difficult to gather meaningful data from - but worth investigating? Whatever the final approach, something more holistic in scope would benefit the OS Monitor, researchers and other users.
It is good to observe important open science tools, such as DOAJ, ROAD, OpenAire, etc. being proposed here; but I worry that there is insufficient emphasis on Green OA. For example, only OpenAIRE will surface data about Green OA, yet OpenAIRE's coverage may be limited. It is probably necessary to seek inclusion of IRs or aggregations outside the OpenAIRE infrastructure.
I would actually refer anyone reading this to my previous comment, because it addresses this: "It is good to observe important open science tools, such as DOAJ, ROAD, OpenAire, etc. being proposed here; but I worry that there is insufficient emphasis on Green OA. For example, only OpenAIRE will surface data about Green OA, yet OpenAIRE's coverage may be limited. It is probably necessary to seek inclusion of IRs or aggregations outside the OpenAIRE infrastructure." But, in addition, by using only Scopus a restrictive - and un-open-science - publication ethos will be imposed on the Open Science Monitor. After all, Scopus is only a subset of the published literature. An important ethos of open science is promoting the publication of research that might sit outside the scope of Scopus or WoS. The drive to publish in journals indexed by these sorts of services distorts healthy scholarly communication and is precisely why open science principles have been adopted by researchers. Therefore: wouldn't the Open Science Monitor simply be reinforcing the bias of Scopus within its open science monitoring system?
As per some of the comments already, open approaches should be the preferred option here. The Open Science Monitor has a unique opportunity to promote a completely different measure of scientific impact. By promoting open metrics the OS Monitor could support the move away from the "traditional" metrics of Scopus and so forth.
As in the treatment of OA publications via Scopus, the use of Scopus, Mendeley and Plum Analytics immediately places a specific bias on the analysis of research's alternative impact. Such bias is antithetical to an "open science" monitoring system. Is there not an opportunity for something more holistic? Or even for, say, alternative metrics to be delivered by, say, Altmetric? Note again that by using Scopus as the benchmark for published works, the underlying dataset will also be skewed towards the indexing coverage of Scopus.
I agree with all the comments so far. I therefore won't restate them, other than to say that my responses to previous aspects of the methodological note above provide practical examples of how there are philosophical and practical issues with using restrictive sources as the basis for monitoring. It is an odd contradiction: monitoring open science using sources that, in part, the open science movement grew up in reaction to.
I agree with the other comments that paid/subscription databases should not be used in this analysis (or be the sole source). This is a massive opportunity to espouse the very open science practices that funders and institutions are aiming to encourage or mandate researchers to adopt.
I agree with George - this cannot be stressed enough. Signals are incredibly important in this ecosystem at this time. Sending the wrong signal could be extremely detrimental to the critical work of past and current individuals building open science up for the benefit of research, not profit margins.
I think that citations to open access research within patents would be interesting to capture. Could use lens.org.
Number of datasets issued with a DOI and where they are cited. I think trying to capture how/where data is reused is more impactful than just monitoring the number of repositories.
I agree with Lluis's comment. This type of survey should be conducted either by the official EU channels or by a neutral 3rd party. Having Elsevier manage this survey could lead to too much bias, as well as survey respondents being unwilling to participate or provide Elsevier with their information. Using the same question issued through multiple channels would be ideal.
Not sure what category this would fall under, but I would love to see open protocols & materials tracked as well - protocols.io & AddGene are great resources.
Would be great to see if there is an opportunity to collaborate with CodeOcean.
Or what about looking at the metrics that the specific journals themselves use?
I agree with all of the above comments. This is an opportunity to untangle many of the issues existing in the current research ecosystem. Do not pass up this opportunity without careful consideration. This is worth doing well - even if it's difficult.
Including for-profit corporations that have a financial incentive to promote their own brands, marginalise open access players and control the narrative around Open Access is counterintuitive and morally bankrupt. Elsevier should be removed as a subcontractor and all brands Elsevier controls should be removed.
Agreeing with the comments: WoS, Dimensions, and Google Scholar must be added.
Other than that, citation counts have nothing to do with the form of Open Access.
And OpenCitations is missing and must be added.
Fields should not be static in a multidisciplinary research community.
The greatest omission here is that it doesn't look into basic Open Science rights: I want to see metrics that allow reuse and redistribution.
Metrics should be added to distinguish between "allows reuse, modification, and redistribution" and "does not allow reuse, modification, and redistribution".
I want a journal metric that shows they take DORA seriously, and do not actually use the flawed JIF metric.
Number of funders that share grant information as Open Data.
Additional metrics: same as above, but also for: 1. software sharing, 2. method sharing.
Additional metric: number of journals that deposit supplementary information in open repositories for long-term archival (e.g. Elsevier doesn't guarantee long-term availability of supplementary information).
And GitLab, SourceForge, BitBucket... those must not be excluded.
Yes, other platforms must be included, such as Zotero, ResearchGate, ScienceOpen, etc.
Remove. This is uninformative and very volatile.
Remove. This is extremely volatile.
Use multiple resources, not only Plum.
No mention of citation of articles in Wikipedia or core research databases. Must be added.
Metrics are missing to measure the uptake of Open Science in industry. They must be added.
Another metric I would like to see added: the percentage of EC grant proposals, both accepted and rejected, published #openaccess in a journal like @RIOJournal.
Data quality and consistency between data providers must be evaluated. Example work of this kind: https://osf.io/938wc/
The source databases should be (1) open, i.e. not behind a paywall, and with a clear, accessible methodology; (2) independent, i.e. not driven by a group that would benefit from it, but done in the public's interest, which probably means maintained by a consortium including universities; (3) multiple, as cross-validation is key in capturing reliable trends. As such, Scopus has to be excluded.
Publication sources cannot include Scopus. If Elsevier, subcontracted in this monitoring action, had its own open science agenda, they could alter their own criteria for inclusion of Open Access publications in Scopus and steer conclusions about OA. There is a clear, immediate conflict of interest.
Remove sources that may suffer from conflict of interest: Scopus.
Number of citations is a very tricky measure because it is associated with a time window: it takes some time for papers to be cited, which is largely dependent on the field of research and on the type of findings. For instance, some truly groundbreaking work may take years to be understood by the community and only becomes cited after a long period of time, while some more minor incremental work may get highly cited if it is published right when a topic is most popular. In other words, the number of citations in a short period of time (e.g. 2 years) is a poor metric of how useful a publication is.
FWCI solves some of the issues of TCS, but inherent biases make it very hard to interpret those numbers.
A common issue with FWCI is that the definition of fields is generally quite arbitrary, and tends to be based on historical separations.
Non-open sources, or especially sources that are controlled by the OSM, should be removed to prevent conflicts of interest.
Access from developing countries to OA publications vs. non-OA publications. There is a chance this can only be evaluated through contact with the individual publishers, or through online surveys.
Survey addressed to scientists on the reception of OA across scientific fields: Do you trust OA more or less than non-OA? Do you benefit from accessing OA publications? Do you benefit from publishing OA?
Survey addressed to scientific journalists, patient organisations, or in general citizens concerned with scientific topics, on the role that OA plays for them, in order to evaluate whether OA contributes to making scientific findings available to society at large.
Cost of OA publishing per journal/publisher.
Do not limit surveys to those organised by entities having a clear conflict of interest on this topic.
Note that on the European scene, every country may have its own version of such a license. The first task would be to list these across EU countries.
Exclude Scopus and Plum Analytics as there is a clear conflict of interest.
Exclude Scopus, Mendeley and Plum Analytics as there is a clear conflict of interest.
Mendeley is not related to altmetrics or Open Science. This needs to be scrapped.
This is not related to Open Science. Remove this metric.
Exclude Scopus, Mendeley and Plum Analytics as sources as there is an obvious conflict of interest.
Exclude Scopus and Plum Analytics from sources as there is an obvious conflict of interest.
It is obvious that Elsevier has a clear conflict of interest in their role as subcontractor for the OSM, and they should be removed from that position. At a minimum, they should be barred from using their own entities as sources. In its current state, there is no guarantee of objectivity whatsoever.
As an open science advocate, I will not have an Elsevier credential; I just deleted my Mendeley account for that reason. So if it is in Scopus, I would tend to think it is not open science. BTW: a login with ORCID instead of Twitter would have been nice...
I think BASE is already getting good data about this question, although it may all come from CrossRef (?); it may be worth having a look at ScienceOpen data, too.
Maybe worth contacting universities that need to produce this data too (most of them have contracts to be above 60% OA in 2020 and are finding ways to monitor that).
Add open data policies; add reagent policies (should newly created material be shared with the community?). Also record what implementation procedures are in place (data management plan required, exceptions allowed, own repository created, what happens if the policy is not followed...) and give more details (is green OA enough or is gold OA necessary, ...).
Add open data policies. Add policies about the materials and methods section.
figshare is also producing annual surveys.
Percentage of data coming with any metadata. Percentage of computer-readable data.
Thank you very much for opening this step of your work to our comments. I hope you will continue to follow this track and make your methodology open and pre-registered. Please make sure that you are not creating incentives to use a particular service by monitoring only part of the spectrum (this also includes not relying too much on Twitter...), and prefer open source initiatives over closed products.
Elsevier has a strong negative image in connection with open science in the research community. Researchers may be unwilling to participate.
To echo the comments above: closed databases are bad for open science. Please do not rely on closed databases (e.g. Scopus). Thanks.
To echo the comments above, closed databases, and metrics derived from closed databases, are bad for open science. Relying on commercial services who have a COI is also bad for open science.
Using a paid database seems problematic for promoting open communication within the scientific community... The other comments have provided reasons for the above claim.
Elsevier products (Scopus) have pretty obvious conflicts of interest?
Using closed databases to track open research is rather ironic. More importantly, it is unnecessary and harmful. Please do not use Elsevier as a subcontractor.
I think APCs should be included and monitored. At the same time, pre-print and post-print policies also need to be considered. A quantitative indicator to include could be the embargo period.
Number of software journals (e.g. JORS https://openresearchsoftware.metajnl.com/)
Number of datasets deposited in scientific data repositories (from Re3data?). Number of papers in data journals (e.g. Earth System Science Data, Scientific Data, ...). Type of licence applied to data: CC0, CC-BY, ...
Question: how can you select/filter "scientific" projects from other types of projects?
Number of software papers in software journals (e.g. JORS https://openresearchsoftware.metajnl.com/ and others)
I absolutely agree with the previous comments. Open databases are key, and subcontracting Elsevier, which has been opposing OA, to monitor OA is counterintuitive to say the least. Thank you for the open process.
I am not quite sure how "open access" is being defined here, but is arXiv.org taken into account? In my fields of research this is the main source of information nowadays (not the publications in journals).
I do not feel that making "# of Scopus publications" the basis for assessing the corpus of scholarly publications will result in a fair assessment of the % of Open Access. Scopus requires that a journal publish for at least 3 years and then goes through a rigorous process where one of the criteria is whether inclusion of a journal will increase or reduce the reputation of the Scopus product. VERY many open access journals supported by university libraries are not indexed in Scopus. Many open access publications are new or experimental and would never be indexed in the Scopus database.
I would recommend looking at this article that compares Plum Analytics, Altmetric and Crossref Event Data: https://osf.io/938wc/ A combination of all three would give a better picture.
I see a conflict of interest in only relying on Scopus, Mendeley and Plum Analytics for this.
What about Researchfish? https://www.researchfish.net/
Evaluating a policy is difficult and the EU is looking for a single body to manage that. Hence the choices for a more or less general body are either Scopus (European), Clarivate Analytics (American), or Google Scholar (American). Add a touch of lobbying from Elsevier (and probably Springer) and you get the result. But the problem is that the result will be completely biased, proprietary and possibly subject to a lot of manipulation, as mentioned in the other comments. The real solution would be to have a specific, independent EU body for the follow-up that will process all the sources mentioned.
This data should always be used with the caveat that it is likely skewed towards UK/European/Western organisations based on its (limited) funding from Jisc - for example, Japan is listed as having no policies although at least two major funders have such a policy: http://openscience.jp/oa/oa-policy/
As above with OA, this data should always be used with the caveat that it is likely skewed towards UK/European/Western organisations based on its (limited) funding from Jisc - for example, Japan's JST has an open data policy not listed in JULIET: http://openscience.jp/oa/oa-policy/
If I read this correctly, this indicator is intended to be a follow-up to the previous Elsevier-led survey. Can the consortium assure us that Elsevier will have no part in the design, dissemination, data collection, or analysis of this round of the survey? If not, it would contradict Paul Hofheinz's recent public statement that: "Elsevier ... is a subcontractor to the project, having agreed to provide data. For the record, Elsevier has no involvement in defining, building or controlling any of the indicators that make up the Open Science Monitor, which the consortium is contracted to produce."
I agree that the choice of Elsevier as a subcontractor is very worrying, since their entire business model is either non-open-source, or very-expensive-open-source, both of which are counterposed to the openness of science, and draw important resources away from good research. It would be more credible to use a more independent subcontractor, which could then choose a more balanced set of resources.
Get rid of Elsevier! Their data are fake: - "it is critical to understand that the Journal Impact Factor has a number of well-documented deficiencies as a tool for research assessment" (San Francisco Declaration) - 'Predatory' open access: a longitudinal study of article volumes and market characteristics, Cenyu Shen and Bo-Christer Björk, BMC Medicine 2015, 13:230, DOI: 10.1186/s12916-015-0469-2 - https://openarchiv.hypotheses.or/2841#more-2841 - http://www.vousnousils.fr/2010/12/24/ike-antkare%c2%a0-le-chercheur-renomme%e2%80%a6-qui-n%e2%80%99existe-pas-457616
Great initiative of the EU to establish an open science monitor. However, in its current form there are some troublesome limitations: 1) # Scopus publications: this point is not clear, as in brackets other sources are also given. I sincerely hope that the EU is not considering only Scopus-listed publications for the monitor, as several open science journals are not listed in Scopus, for example the "Journal of Open Source Software" and the "Journal of Open Research Software".
2) Transparency and Open Access are among the most important benefits of Open Science. In contrast, the Scopus database can only be accessed with Elsevier credentials, making it impossible for citizen scientists without institutional access to check and validate any results from the monitor. In the light of current events, this might also apply to German and Swedish scientists in general!
I don't know what the CWTS method is, but I agree with the comment from Jon Tennant above (https://www.makingspeechestalk.com/ch/Open_Science_Monitor/comm/1397): it's troublesome that tax money is used to reinvent the wheel when the data are readily available from other services. Also, I hope that the CWTS method is better than the one currently applied by Elsevier/Scopus, as many OA articles are not reported as OA in Scopus or not listed at all (examples can be provided). There is a great risk of greatly underestimating the role of OA with the currently proposed methodology.
This definitely needs to be extended to other repositories - the ones mentioned by Egon Willighagen above as a start, but many more exist. The Journal of Open Research Software has an extensive list: https://openresearchsoftware.metajnl.com/about/#repo Again, a case where the proposed methodology would under-report already existing efforts towards OA.
+1 for the comment by Alessandro Sarretta. Worrisome that these journals are not listed in Scopus... Also, releases in Zenodo should be counted.
I think the uptake of scientific results in Wikipedia is an important aspect not yet accounted for.
Please don't rely on one data source; combine Altmetric, Plum, ImpactStory. For the future: please provide an ORCID login - not everyone is on Twitter and Facebook.
Being a paid database and using non-inclusive criteria are not the only concerns here. Using Scopus as the primary basis for assessment will lead to disciplinary inequalities as well. The low coverage of Arts and Humanities journals in Scopus has been pointed out several times in previous studies. For instance, Mongeon and Paul-Hus (2016) (DOI: 10.1007/s11192-015-1765-5) conclude that the Scopus coverage of Arts and Humanities journals (compared to Ulrich's extensive periodical directory) is less than 20%. Also, in Arts and Humanities disciplines article citations are much less dominant indicators of excellence than in STEM. Therefore the inclusion of DOAB in the sources would contribute to a somewhat more balanced picture.
Assessing life sciences with Scopus and WoS would be acceptable. But social, educational, law, computational and economic sciences are poorly indexed by these databases. Indeed, history, educational and social sciences produce mainly book chapters, books and reviews, types of documents poorly indexed by Scopus & WoS. Computational sciences produce mainly conference proceedings, which are also poorly indexed by those databases. Disciplines relying heavily on local context, such as history or human geography, are not published in English international journals and are not indexed by Scopus & WoS. Therefore sources for OA monitoring should be selected by domain. One size fits all is scientifically irrelevant.
Institutional OA repositories are very useful to monitor Gold, Green, Platinum and hybrid OA. Today, many more researchers have an ORCID and some institutions also have ORCID. What about pushing for an ORCID and Unpaywall combination for OA monitoring? They are both non-profit organisations. In fact, non-profit tools should be used as the basis of the OA monitoring calculation: ORCID, Unpaywall, Zenodo (non-profit EU-supported OA and ORD database), institutional databases, and the huge thematic medical PMC and EuropePMC databases. The commercial Scopus and WoS databases may be used to "check" the results at the end. By doing so: - OA monitoring would *motivate* researchers to create an ORCID and fill in their institutional/thematic databases. - It would weaken the IF and h-index for career assessment and be compatible with DORA signatories. - It would allow OA to rise without mandatory national or university laws. - The calculation at the beginning may not be so accurate, but with *time* it will improve, without any *dependence* on commercial Scopus and WoS. They would then only be tools among others in a fair competitive market.
The use of Mendeley and Plum Analytics, owned by Elsevier like Scopus, is maybe not compatible with EU competition law. The exclusive use of Scopus, Mendeley and Plum Analytics for OA monitoring may be a violation of EU competition law by abuse of market dominance, even if the metrics would officially be performed by CWTS. This configuration should be checked by the EU competition commission. Alternatives are Altmetric, Zotero, Papers, EndNote, Reference Manager, BibTeX, etc.
In complete agreement with Jon (and also Etienne, but that's a different matter). Twitter is antithetical to science writing/reading.
Using the proprietary and for-profit Scopus to monitor open science is like asking a fox to guard the henhouse in order to check the freedom of the chickens.
Stakeholders engaged in setting up participatory innovation projects (like citizen science projects) should ensure the integration of sex/gender analysis in order to guarantee that innovative processes benefit all segments of the population without bias.
Awareness must be raised in the OS/OI policy and research community on the relevance of gender and ways OS/OI can mitigate against gender inequality and bias in the various aspects of OS/OI.
The Genderaction project has studied the links between OS/OI and gender and a report will soon be publicly available. Key findings have been published in the form of a policy brief: http://genderaction.eu/wp-content/uploads/2018/07/GENDERACTION_PolicyBrief5_Gender-OSOI.pdf
The European Commission and its OSPP Expert Group, along with other stakeholders involved in research assessment (RFOs, RPOs), are encouraged to explore how/if the use of new metrics impacts men and women researchers differently at different career stages and in different disciplines. The choice of metrics during research assessment/evaluation procedures has important implications for researchers' priorities and strategies, their choice of publication venue and the way they present their CVs.
Open Peer Review: as new forms of peer review (like OPR) are being introduced, it is important to examine how different practices impact on gender bias and explore ways of adequately addressing them.
Scopus is not the right source: a) to be included you need to have at least 3 issues, so for annual journals it means 3 years; b) monographs are never included in Scopus, and the coverage of Social Sciences and Humanities is not so wide. Using just Scopus means cutting off half of the scientific world. At least DOAB (Directory of Open Access Books) needs to be used, alongside at least DOAJ (Directory of Open Access Journals); c) Scopus is a proprietary database, so it can't be the only source. Moreover, it's owned by Elsevier, which has been lobbying for years against Open Access.
Having said that using Scopus is not fair as it's proprietary and immediately opens a conflict of interest, counting the number of green publications included in Scopus is absolutely limiting. You will exclude all the monographs, the book chapters, the SSH journals not included in Scopus and all the emerging communication tools such as preprints. "Green" open access is a wider concept; measuring it against Scopus is methodologically flawed. What about all the preprints included in arXiv and not yet published in a journal? They are Green OA and won't be counted, excluding also Physics. Absurd.
Again, if you consider only Scopus - with all the proprietary/conflict of interest limitations already noticed - all these indicators should be named not "publications" but "journal articles" percentages.
Dealing with citations you can't stick only to Scopus. Web of Science and other databases show different numbers. And, again, you are referring only to journal articles and not to "publications" in general, and only to STEM disciplines.
Come on! Do you really want to measure Open Science with rankings and percentiles? Open Science also means alternative evaluation. Journal rankings have hugely damaged science and generated the serials crisis which was at the start of the Open Access movement! Do you really want to measure openness with rankings? Unbelievable. It happens because your subcontractor is the vendor of a ranking-based tool. There are plenty of EU reports stating that research evaluation needs to take into account and reward open choices by authors. See https://ec.europa.eu/research/openscience/index.cfm?pg=rewards_wg
1) The ORCID and Unpaywall association is a good suggestion 2) DOAB for monographs 3) DOAJ to also include SSH journals 4) preprint servers 5) tracking the costs, e.g. via OpenAPC https://treemaps.intact-project.org/
Publishers' policies. You might also include book publishers here.
1) Number of downloads from data repositories 2) Number of CC0 policies associated with datasets
An endogamic indicator! You compare an Elsevier-owned database with an Elsevier-owned social network, which by the way most users left the day after the Elsevier acquisition. All these Scopus/Mendeley indicators are useless as they are too narrow.
Using just Mendeley is absolutely misleading; many researchers left it the day after the Elsevier acquisition. There are lots of other academic social networks to be taken into account.
1) Preprints, preregistration practices, open peer review, open lab notebooks, protocols and openly shared workflows are completely absent from this framework, but they are the core of Open Science practices. 2) You need to include SSH disciplines in this picture. 3) Scientific blogging is also absent (see e.g. Hypotheses.org). 4) Alternative research assessment criteria or best practices adopted, as collected by DORA.
I agree with the comments above regarding preprints. Consideration of preprints is important and it is not clear how they are included. In addition, only 4 sources of preprints are used, which is quite limiting. OSF preprints and Preprints.org (disclosure - I am director of the latter) are not included.
Unacceptable restriction to Scopus. At the very least it should be complemented by Clarivate Analytics data.
Unacceptable restriction to Scopus (for all the reasons mentioned in the previous comment). At the very least it should be complemented by Clarivate Analytics data.
What would be in the spirit of Open Science monitoring would be to take into account the diversity of research outputs that are accessible via repositories. Using Scopus for Green OA monitoring is IMHO almost an oxymoron.
If this indicator is meant to measure all the "publications" in any discipline and language, and not only "journal articles", the use of Scopus is highly misleading. Monographs are not included in Scopus, nor are the vast majority of journals published in the Social Sciences and Humanities. Scopus is English-speaking oriented, thus excluding publications in other languages. The use of Scopus cuts off half of the disciplines in the ERA. Moreover, Scopus is a proprietary database, so it should never be used as the only source.
Using this indicator will result in a very strong STEM/journal/English-speaking bias. At least DOAB (Directory of Open Access Books) should be used in order to include Open Access books.
Again, this indicator does not take into account all the monographs indexed in DOAB (Directory of Open Access Books), most of which are Gold OA. It also excludes all the university-based Open Access journals which are Gold even though APC-free. It scarcely represents SSH disciplines, as their journals are not indexed in Scopus.
Using a single, proprietary database to measure citations is a wrong method. All other available databases or tools should be added. It has to be underlined that, again, monographs and most of the SSH journals are excluded from any citation database.
Measuring Open Science, which tends to be an alternative paradigm to the current scholarly communication system, by rankings and percentiles is complete nonsense. Applying old categories and indicators to new ways of publishing might result in a distorted - and useless - picture. Moreover, the journal ranking system contributed to creating the so-called "serials crisis" which was at the beginning of the Open Access movement, so it would be ironic to measure Open Access with the same indicator it was born to fight. And, again, books are cut off from citation counts.
You might consider also adding publishers' policies, in order to also include book publishers.
Track the scientific blogosphere; there might be blogs or projects addressing societal needs and involving societal actors/citizens. Again, SSH disciplines seem to be neglected.
The use of Mendeley is misleading as it is not the only academic social network. Moreover, coupling it with Scopus results in a sort of "endogamic" indicator referring just to the Elsevier sphere.
The picture coming out is highly biased towards STEM disciplines and English-speaking resources, and limited to journals only. To include the SSH: 1) Create a new set of indicators specifically referring to monographs, using e.g. DOAB as a source. 2) Include specific SSH-oriented tools like Isidore, which address the multilingualism issue. 3) Track SSH-oriented projects developed by open infrastructures like DARIAH or CLARIN. 4) Include scientific blogs to track the open debate in the SSH, using e.g. Hypotheses.org.
If we want Open Access results then we must ensure that the infrastructure underpinning them is built on the same principles. Placing the monitoring of Open Access output exclusively in the hands of a closed, for-profit service that ignores large swathes of academic output (e.g. non-English, AHSS fields) not only undercuts the monitor's results; it will damage the credibility of the EU open policy goals themselves.
We are all aware of the importance (and power) of Scopus and WoS, but when will the EU really open itself to alternative indicators? Monographs are absent or at least poorly represented, thus leaving SSH production clearly unrepresented. Strongly advise using at least DOAB references.
It is stated in the draft methodological note that "the study covers all research disciplines, and aims to identify the differences in open science adoption and dynamics between diverse disciplines" (p. 6). I think it is hard to find someone today arguing that one data source can accomplish this; a variety of sources is needed (and preferably open, as stated in comments above). To pick one will take us back to the problem of searching for the key where the light is, and this is not in tune with recent developments in the scientometric indicators field such as the Leiden Manifesto (principle 6).
Why are all the sources listed for indicator P when, as I understand it, it will be based solely on Scopus? The same goes for the indicators TCS, FWCI, TP1/TP10.
I agree with what is mentioned in the comments above regarding diverse data sources (and would also like to see the principle of open sources and methods in use). I would also like to challenge the use of altmetrics as one concept used as indicators under Open collaboration. It is not clarified what is meant by Open collaboration (between researchers, with the larger community?) in the draft methodological note, so it is hard to tell which indicator would be suitable: Mendeley could be a choice for collaboration between researchers, preferably alongside other data sources so as not to rely on only one.
I assume this will be based on Mendeley like the other readership scores. Dividing them into subcategories may be misleading, as we know that only a certain share of users state their career status and whether they belong to academia or not. The users also introduce biases for discipline and career age that have to be expressed clearly (see for example: https://wlv.openrepository.com/bitstream/handle/2436/620754/MendeleyReadershipCountsTimeAndDiscipline_preprint.pdf;jsessionid=332804C063F0DD8BA15C11EBBEB96D78?sequence=2). Other sharing/reading sources could complement Mendeley.
I agree with the comments above, and would also like to add that it has to be stated how Twitter indicators point to open collaboration (the title of this section). One finding is that tweeted publications are generally in categories where we don't count citations (news, editorials), and as such they could maybe be used for an alternative attention indicator, but not necessarily a collaboration indicator (https://www.altmetric.com/blog/uncitable-research-is-infinitely-more-tweetable-or-what-kind-of-publications-get-shared-on-twitter/).
I agree with many of the comments above (and I do not appreciate having to log in to make a comment, on top of that either with Facebook or Twitter). It is not good to base this only on Scopus (Elsevier).
I agree with many of the comments above (Bianca Kramer and Jon Tennant) as well as others. The main point is: how can the EC/EU create or use an open science monitor basing itself on one source (Elsevier)? Elsevier does NOT represent openness.
It's disappointing to see that the only outlets mentioned, at least as far as I could discern, are journals. In many disciplines the main (or a main) form of publication is the book - whether single-authored, a collection of related papers on a given theme, or a specialist catalogue or textual edition and commentary. It would be good to see this form of publication included here. I speak both in a personal capacity and as a member of the UUK OA monograph working group.
I'd echo Nigel's comment (also as a member of the UUK OA monographs working group), particularly given that significant funders are signalling intent to develop OA policy in the direction of long-form publications.
We thank the Commission for the initiative of the Open Science Monitor. This tool must be free of bias and trustworthy. The methodology raises several questions. We detail our remarks and recommendations for future work in a 5-page letter. We suggest building the next generation of the open science monitor with a variety of open data sources. Bernard Larrouturou, Director-General for Research and Innovation and President of the French Open Science Committee. https://www.dropbox.com/s/pyi1urbwp143c85/France%20-%20Feedback%20on%20EC%20Open%20Science%20Monitor%20Methodological%20note.pdf?dl=0
Two main problems with Scopus as a baseline for total publications: it is not transparent and its coverage is not sufficient in terms of languages, document types and fields.
The question is what kind of statements you intend to base on this variable. To what extent is this valuable for assessing the adoption, penetration or effect of open science?
Publications with open licenses; publications for which a preprint is available; publications with open peer review reports; publications with open peer review identities; publications with open peer review editor/reviewer/author communication
Publications of which the citation data are fully open and complete (Crossref)
OA availability of sources cited in the papers
Researchers' attitudes towards open access
Researchers' attitudes towards types of open licenses
Share of institutions having a repository
Countries having scholarly open access rights in their copyright law
Number of fully open access and openly licensed scholarly books
Number of registered and open peer reviews (Publons)
Number of institutions with an established open science community
Number of institutions having signed DORA
Number of publishers having signed DORA
Number of funders having signed DORA
Share of institutions having a strong repository mandate (ROARMAP)
Number of journals flipped to full OA
Number of openly available proposals and decision letters
Funders with data and code sharing requirements
Funders with diversity statements/statistics regarding their review panels
Openly licensed open access to back volumes, back to vol. 1 issue 1
Percentage of papers in the full publication history of the journal that are OA
Journals with policies preventing re-enclosure after selling the title to another publisher
Number of diamond/platinum journals and papers therein
Is more better?
Number of papers reporting research reusing existing data sets
Number of papers based on openly available (raw) data
Number/share of papers having a statement on data availability
Additions of new data/datasets to data repositories, by field
Openly available datasets, by license
Number of journals by TOP Guidelines variables and level
Number of data citations
Carry out your own independent survey, or at least perform a systematic review of all survey research on researchers' data sharing views
Number of data papers
Number of software citations
Related: number of articles citing RRIDs
Members of national and international citizen science organizations
Mentions of citizen science in newspapers
Mentions of citizen science in the full text of articles
AFAIK Mendeley "reads" include instances of downloads of bibliographic data only and thus are a very questionable indicator
Mendeley readership data are highly questionable
Institutions having open science in their mission statement
Institutions having open science in their researcher code of conduct
Number of journals supporting the registered report document type
Number of preregistrations
Journals indicating contributor roles
Research funding proposals in which open science is explicitly mentioned
Academic job ads mentioning/asking for open science experience
Jobs for data managers, data consultants and data stewards at universities
Number of papers published in journals that explicitly state they select papers on rigour only, not on novelty
Number of replication studies published
Number of papers with a plain language abstract
Number of papers with abstracts in 2 or more world languages
Number of papers stating that only open source software is used
Number of scientific conference posters shared openly in e.g. Figshare, Zenodo, etc.
Number of conference presentation slides shared openly