There has been much debate over the years about the validity of impact factors (and there have been many attempts to measure impact, none wholly accurate). Just this week on Twitter, the discussion took off again after the publication of an article by Jennifer Howard entitled “Scholars seek better ways to track online impact” in The Chronicle of Higher Education (January 29th 2012), which highlights the work on “alternative metrics” done by Jason Priem (a graduate student in library science at the University of North Carolina), who helped write a manifesto on “altmetrics” (see: http://altmetrics.org/manifesto/).
The advent of the social web and of open access initiatives has created a myriad of possibilities for sharing and communicating research output almost instantaneously, and has changed the ways in which researchers disseminate their work. There seems to be a loosening of the neck tie in some quarters: no longer is there an absolute reliance on peer review, on citation-counting measures, or on journal impact factors (which have always been questionable anyway). As the altmetrics manifesto states, citation counts are in themselves a narrow and inaccurate measure, because the context in which a work is cited – especially its socio-political context – is often ignored, and the impact of an article is often not felt for several years after publication. Certainly, the altmetrics movement states that we can no longer rely only on traditional bibliometrics to measure impact, and re-thinking impact measurement for research is clearly a step in the right direction. However, as the ‘conversation’ on Twitter and the various blogs on this suggest, there is much work still to do in assessing research impact.
The perennial question of the relationship between ‘counts’ and actual impact is still being asked – popularity is not the same as excellence. What altmetrics attempts, according to its manifesto, is to capture what is being read, bookmarked, shared, discussed and cited online in order to provide a pattern for analysis. And it applies not just to whole articles but also to what is referred to as the “nanopublication”, where “the citable unit is an argument or passage rather than [the] entire article.” Data, code and design, and the re-use of these elements, also need to be captured in order to measure the true impact of the original:
These new forms reflect and transmit scholarly impact: that dog-eared (but uncited) article that used to live on a shelf now lives in Mendeley, CiteULike, or Zotero. The hallway conversation about a recent finding has moved to blogs and social networks – now, we can listen in. The local genomics dataset has moved to an online repository – now, we can track it. This diverse group of activities forms a composite trace of impact far richer than any available before. We call the elements of this trace altmetrics.
Heather Piwowar, who is also working on alternative metrics, states in a post on her blog, Research Remix, that research impact now has flavours that need to be captured (and there could be as many as 31 – the same number of flavours that Baskin-Robbins claims to have for ice cream!). This is where altmetrics comes in. It isn’t, Piwowar says, about comparing flavours, as one is no more important than another; it is about capturing the flavour(s) so that they form a complete picture. In other words, as the scholarly information landscape becomes murkier, with references abounding on social networking sites, traditional citation counts alone can no longer be the only measure of academic excellence, weightiness or impact.
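The ‘flavours’ idea can be illustrated with a minimal sketch: rather than reducing an article to a single citation count, each engagement event is tallied by the channel it came from, giving a profile in which no one flavour outranks another. All the channel names and event data below are invented for illustration, not drawn from any real altmetrics service.

```python
from collections import Counter

# Hypothetical engagement events for one article, each tagged with the
# channel (the "flavour") where the engagement happened.
events = [
    ("twitter", "share"), ("mendeley", "bookmark"),
    ("blog", "discussion"), ("twitter", "share"),
    ("citeulike", "bookmark"), ("scholar", "citation"),
]

def flavour_profile(events):
    """Tally events per channel; the profile is a picture, not a ranking."""
    return dict(Counter(channel for channel, _ in events))

print(flavour_profile(events))
# {'twitter': 2, 'mendeley': 1, 'blog': 1, 'citeulike': 1, 'scholar': 1}
```

The point of keeping the counts separate, rather than summing them into one number, is precisely Piwowar’s: two tweets are not ‘worth’ two bookmarks, but both belong in the picture.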
Altmetrics is at an early stage of development, and there is much work to be done in testing and evaluating the tools that will measure impact in the digital landscape. The acceptance of these tools will likely require a (significant?) change in institutional mind-set, especially among traditional researchers who may not be using the social web to communicate their research. In any case, the development is important in the context of advances in technology (especially in the area of the semantic web) and may be timely given that we will be undergoing the Research Excellence Framework in 2014 and need to consider impact in all its guises.
The altmetrics group continues to launch tools – see, for example, Total Impact and the Altmetric Explorer. Until we move towards measuring impact in this new way, we can continue to use the traditional methods, i.e. the various citation indexes, Google Scholar, and even Anne-Wil Harzing’s software Publish or Perish, which analyses citations on Google Scholar according to established citation measures. Alternatively, we can use both methods alongside each other to demonstrate ‘total impact’ and to “uncover the invisible in research”.