Keeping Up With... Altmetrics

This edition of Keeping Up With… was written by Robin Chin Roemer and Rachel Borchardt.

Robin Chin Roemer is Instructional Design and Outreach Services Librarian at University of Washington Libraries, email: robincr@uw.edu. Rachel Borchardt is Science Librarian at American University Library, email: borchard@american.edu.

Introduction to Altmetrics

Cites. Tweets. Downloads. Views. In today’s digitally transformed higher education landscape, the lines between popular and scholarly influence are blurry at best. More and more, scholarly communication is moving away from the strict sphere of conferences and published literature and into Internet-enabled arenas like blogs, institutional repositories, online interdisciplinary communities, and social media sites.

Into this setting, enter altmetrics. Altmetrics is an emerging category of impact measurement premised upon the value of “alternative metrics,” or metrics based distinctly on the opportunities offered by the 21st-century digital environment. Originally defined in contrast to the more established field of bibliometrics, altmetrics is fast becoming a fluid area of research and practice, in which various alternative and traditional measures of personal and scholarly impact can be explored and compared simultaneously.

In this Keeping Up With… edition, we look at key points in the rapid development of altmetrics, from its 2010 origins to its more recent relevance to librarians and administrators.

Bibliometrics vs. Altmetrics

One of the first hurdles in understanding altmetrics is unpacking its designation as an “alternative” to bibliometrics, i.e. metrics derived from the quantitative analysis of scholarly publications, such as citation-based calculations of article and journal influence.

Since the mid-20th century, Impact Factor and citation counts have been the standard means for evaluating the impact of journals and journal articles. More recently, however, we have seen the increasing use of alternative citation-based metrics for journal quality that use different algorithms to rank journals against others in their field, most notably the SCImago Journal Rank (SJR) and Source Normalized Impact per Paper (SNIP). Even Google Scholar has begun producing its own journal rankings for scholarly disciplines.
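
For readers less familiar with how such citation-based measures are built, the classic two-year Journal Impact Factor for a given year Y is, roughly speaking:

    \mathrm{IF}_{Y} = \frac{\text{citations received in } Y \text{ to items published in } Y{-}1 \text{ and } Y{-}2}{\text{citable items published in } Y{-}1 \text{ and } Y{-}2}

SJR and SNIP begin from similar citation data but treat it differently: SJR weights each citation by the prestige of the citing journal, while SNIP normalizes citations against the citation potential of the journal’s subject field.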

However, all of these innovations are still considered bibliometric measurements. The main distinction between bibliometrics and altmetrics lies in the type of data being used: altmetrics rests on the idea that new and different kinds of data can be used to determine impact and quality. Moving beyond citation-based views of scholarly influence, altmetrics gives authors data such as article pageviews and download counts, along with social media and article-sharing indicators of impact that complement citation-based metrics.

A Growing Community

The rise in altmetrics stakeholders has been significantly shaped by the growth of the open access, scholarly communication, and open data movements. All of these movements share a common goal of transforming the current state of higher education and research by utilizing 21st-century tools such as digital technology and social media. They also increase the availability of data for altmetrics while giving researchers incentive to use and contribute to altmetrics tools. For example, the Public Library of Science (PLoS) was one of the first publishers to give its authors access to article-level metrics, while Mendeley, an altmetrics-enabled citation management and networking tool, offers users unique data about article readership. Single-user tools like ImpactStory have also emerged as a way for researchers to capture their impact through altmetrics data channels. Likewise, entities like Altmetric.com have developed apps that demonstrate how altmetric data can enhance bibliometric data through integration with Scopus and other websites. More recently, higher-level altmetrics tools like PlumX (now part of EBSCO, as of January 15, 2014) have emerged that summarize and compare the impact and quality of not only individuals but also research centers, departments, and institutions around the world.
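
To make the kind of data these tools traffic in more concrete, the short Python sketch below pulls the publicly available altmetrics for a single article from Altmetric.com’s free REST endpoint. This is only a minimal illustration: the DOI is an arbitrary example, and the JSON field names are assumptions based on the service’s public documentation, so they are read defensively in case a given field is absent.

    # Minimal sketch: fetch Altmetric.com's public record for one DOI.
    # A 404 response simply means the service has no data for that article.
    import json
    import urllib.request

    doi = "10.1371/journal.pone.0000308"   # arbitrary example DOI
    url = "https://api.altmetric.com/v1/doi/" + doi

    with urllib.request.urlopen(url) as response:
        record = json.load(response)       # one JSON object per article

    # Field names below are assumptions drawn from the public documentation;
    # .get() defaults keep the sketch from failing if a field is missing.
    print("Title:            ", record.get("title", "unknown"))
    print("Altmetric score:  ", record.get("score", 0))
    print("Tweeters:         ", record.get("cited_by_tweeters_count", 0))
    print("News outlets:     ", record.get("cited_by_msm_count", 0))
    print("Mendeley readers: ", record.get("readers", {}).get("mendeley", 0))

Services like ImpactStory and PlumX aggregate these same kinds of signals across many sources and many outputs at once, rather than one article at a time.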

With the significant rise in interest in altmetrics has come a recognized need for standards and best practices. The National Information Standards Organization (NISO) is currently undertaking an initiative to better understand these needs and, ultimately, to help further the altmetrics movement.

Opportunities & Controversies

Like bibliometrics before it, altmetrics has generated its fair share of praise and criticism. Chief among the latter has been the idea that because altmetrics do not necessarily measure in-depth scholarly engagement, they can either be “gamed” (e.g., raised through disingenuous interaction) or lack the inherent value of citation-based metrics. Indeed, several studies have been conducted to determine whether an article’s views and “tweets” correlate with its number of times cited, with variable results. Lately, however, with greater recognition of the potential for gaming across all types of impact indicators, attention has shifted toward the additional spectrum of impact that altmetrics can capture beyond the bounds of academia.
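
The correlation studies mentioned above generally reduce to a simple statistical comparison between two sets of counts. The sketch below shows the shape of that comparison using a Spearman rank correlation, a common choice because both citation and tweet counts tend to be heavily skewed; the numbers are invented purely for illustration and come from no real study.

    # Hypothetical illustration only: the counts below are invented, not data
    # from any actual study. Spearman's rho compares the *rankings* of articles
    # by tweets and by citations, which suits skewed count data.
    from scipy.stats import spearmanr

    tweet_counts    = [0, 2, 5, 1, 40, 3, 0, 12, 7, 1]    # invented
    citation_counts = [1, 4, 9, 0, 15, 6, 2, 20, 5, 3]    # invented

    rho, p_value = spearmanr(tweet_counts, citation_counts)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")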

A key issue for altmetrics is the growing demand for quantitative evidence of impact from scholars across the disciplines. One of the many criticisms previously leveled against bibliometrics is that they are calibrated to the interests of traditional scientific researchers, not those of researchers in the non-sciences or non-traditional fields, where impact cannot be measured accurately via citation data. By bringing into consideration a new array of metrics based on consumption, user engagement, and quality, altmetrics tools claim to offer a broader picture of influence for researchers within and across specialties. However, this potential has yet to be practically realized, due both to longstanding disciplinary differences in how impact is defined and to a growing recognition that scholars may need discipline-specific altmetrics resources, particularly in the humanities.

Library Involvement

Academic librarians have been and continue to be involved with altmetrics at every level, a fact grounded in the field’s overlaps with open access, research practices, and collection development. Some libraries, for instance, see altmetrics as a way to persuade scholars to contribute to an institutional repository, where altmetrics data can be harvested. Others have added altmetrics to their instruction and outreach efforts, making sure scholars and administrators at their institutions are aware of both the benefits and limitations of altmetrics, or providing a forum for discussion of these issues. Librarians have also been active in online discussions and national meetings, vocalizing their perspective on the adoption and advancement of altmetrics in relation to larger information and access issues. Some have even begun incorporating altmetrics into their own professional or scholarly work, using altmetrics data to learn more about their collections, their researchers, or their impact on the field of LIS.

As altmetrics continue to grow and evolve, it is essential that librarians keep abreast of developments to help move the conversation forward and represent the needs of faculty, administrators, and information professionals.
