CrossRef as a source of metadata for literature in the arts and humanities

For those seeking citations for literature in the arts and humanities, the most prominent tools have limitations of discipline, language, geography and/or open availability. The authors of “Crossref as a bibliographic discovery tool in the arts and humanities” investigate CrossRef as a potential source of literature in the arts and humanities. CrossRef has the advantages of being an open, community-governed, non-profit and globally adopted platform for sharing research objects; its mission is to make “research objects easy to find, cite, link, assess, and reuse.” The authors examine Dimensions, Google Scholar, Microsoft Academic, Scopus and Web of Science as potential sources of research in the arts and humanities. Google Scholar has the most comprehensive coverage and is free to use, but the researchers dismiss it for its lack of widespread use. Web of Science and Scopus are both large and widely used, though their coverage tilts toward English-language STEM research produced in the Global North. Ultimately, the authors choose to compare CrossRef with Scopus, using the European Reference Index for the Humanities and Social Sciences (ERIH Plus) journal title list as the basis for subject comparison, because CrossRef does not include subject metadata.
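To make the comparison concrete, here is a minimal sketch in Python of what such a coverage check could look like, using the public CrossRef REST API’s journals endpoint. This is an illustration only, not the authors’ actual method, and the ISSN list file name is a placeholder.

```python
import requests


def in_crossref(issn: str) -> bool:
    """Return True if CrossRef has a journal registered under this ISSN.

    The public CrossRef REST API answers /journals/{issn} with a 404
    when the ISSN is not registered.
    """
    resp = requests.get(f"https://api.crossref.org/journals/{issn}", timeout=30)
    return resp.status_code == 200


# Placeholder input: a text file with one ERIH Plus ISSN per line.
with open("erih_plus_issns.txt") as fh:
    issns = [line.strip() for line in fh if line.strip()]

covered = sum(in_crossref(issn) for issn in issns)
print(f"{covered} of {len(issns)} journals found in CrossRef")
```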

The authors found that CrossRef covers more ERIH Plus journals than Scopus (80% versus 49%) and offers better coverage of journals published in Eastern and Southern Europe, Africa, Asia and Latin America. They note the significance of this for the arts and humanities, where research often has a regional or national focus. The disadvantages of CrossRef as a discovery tool come through in the metadata available. The lack of subject metadata is a major drawback for any search that doesn’t begin with a known citation. Reference linking and cited-by tools depend on publishers depositing reference metadata with articles, and the study found that only half of the journals have article reference lists; this should improve, as CrossRef required publishers to make reference lists open in 2022. The inclusion of abstracts and author information also varied with the language of the document. The study authors conclude that CrossRef has strengths in its coverage of the arts and humanities but also has problems as a discovery tool. They place the responsibility for this with publishers and encourage further study of publisher motivations and practices for sharing metadata.
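The metadata gaps the authors describe are easy to see record by record. As a rough illustration (assuming the public CrossRef REST API works endpoint; the DOI used below is CrossRef’s long-standing test record, and any DOI of interest can be substituted), one can check which discovery-relevant fields a publisher actually deposited:

```python
import requests


def deposited_fields(doi: str) -> dict:
    """Fetch one work from the CrossRef REST API and report whether the
    publisher deposited an abstract, a reference list, subject terms
    and author information."""
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    resp.raise_for_status()
    record = resp.json()["message"]
    return {
        "abstract": "abstract" in record,
        "references": bool(record.get("reference")),
        "subjects": bool(record.get("subject")),
        "authors": bool(record.get("author")),
    }


# CrossRef's test DOI; substitute any DOI of interest.
print(deposited_fields("10.5555/12345678"))
```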

Measuring open science progress with indicators

Open science is gaining attention as policy initiatives such as the UNESCO Recommendation on Open Science and the Nelson Memo, along with the open access and open source movements, develop and mature. UNESCO defines open science as:

“an inclusive construct that combines various movements and practices aiming to make multilingual scientific knowledge openly available, accessible and reusable for everyone, to increase scientific collaborations and sharing of information for the benefits of science and society, and to open the processes of scientific knowledge creation, evaluation and communication to societal actors beyond the traditional scientific community. It comprises all scientific disciplines and aspects of scholarly practices, including basic and applied sciences, natural and social sciences and the humanities, and it builds on the following key pillars: open scientific knowledge, open science infrastructures, science communication, open engagement of societal actors and open dialogue with other knowledge systems.”

UNESCO Recommendation on Open Science (p. 7)

Stakeholders in research and communication – researchers, funders, policy-makers, research institutions, publishers, technologists, librarians, private and public interests, etc. – now have the task of understanding how open science policies and practices are adopted. This requires a common understanding of indicators, followed by data collection and analysis.

PLOS (the Public Library of Science) conducted an Open Science Indicators (OSI) study comparing how data, software code and preprints were shared in 60,000 research articles it published between January 2019 and June 2022 with sharing in 6,000 open access articles on similar topics published by others over the same period. The goals of the study are to:

  • Improve ability to measure success of solutions
  • Understand different communities and co-create new solutions
  • Support Open Science initiatives outside PLOS with reliable data
  • Increase adoption of Open Science practices globally

You can read more about the OSI study here and here (Scholarly Kitchen post), and you can download and use the data. Importantly, PLOS is demonstrating open science practices by openly sharing the study’s guiding principles, methodology, definitions, data and analysis. Initial analysis shows promising results: the data appear accurate, the indicators can be tracked and expanded at scale, and adoption of open science practices is increasing. PLOS is clear that these indicators are designed to monitor practices but NOT to rank journals, researchers or institutions. The OSI study is a robust starting point for understanding open science adoption and its impact.
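For readers who do download the OSI dataset, here is a brief sketch of the kind of summary it supports. The file name and column names are assumptions made for illustration, not the actual OSI schema; consult the documentation PLOS publishes alongside the data before adapting this.

```python
import pandas as pd

# Assumed file and column names -- not the actual OSI schema.
df = pd.read_csv("open_science_indicators.csv")

# Yearly share of articles flagged as sharing data, sharing code,
# or having a linked preprint.
summary = (
    df.groupby("publication_year")[["data_shared", "code_shared", "preprint_posted"]]
    .mean()
    .round(3)
)
print(summary)
```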