Research Reveals PlumX from Plum Analytics is Not Just Altmetrics
Recently, Plum Analytics President Andrea Michalek spent time with author J. Michael Lindsay, MSIS, AHIP, Associate Professor and Serials and Electronic Resources Librarian at Preston Medical Library, The University of Tennessee Graduate School of Medicine. Mr. Lindsay authored an article on PlumX (toll-access only) for the Journal of Electronic Resources in Medical Libraries, published in 2016.
Mr. Lindsay comments on PlumX, but also on altmetrics at large. He points out, among other things:
- How the definition of research outputs has expanded
- How context plays a role in metrics
- Whether a total metric score for research is helpful or not
- Whether tenure and promotion may be influenced by altmetrics
Here are some excerpts and observations from his article, titled “PlumX from Plum Analytics: Not Just Altmetrics.”
Are altmetrics poised to replace citations or other measures of scholarly impact? No. However, they supplement those measures, providing a valuable 360-degree view of impact.
“…as promising as Altmetrics are, librarians and researchers do not appear to be ready at this point to completely replace previous measures of scholarly impact with these new measures. This is what makes the idea behind Plum Analytics’ PlumX Metrics particularly interesting. PlumX provides a single interface to view all relevant metrics for an article.”
The effect social media is having on the measurement of scholarly works. Context plays a role.
“Social media has given researchers the ability to share their work almost instantaneously. Consequently, this has allowed for some measures of the impact of research to be known far more quickly as well. What has been missing has been a means of presenting all metrics, in context, in as close to real-time as possible. PlumX uses data visualization to show more traditional measures such as citations alongside newer measures, such as downloads, abstract views, online comments, and social media likes, shares, and tweets.”
“PlumX places Altmetrics in their proper context to show a more complete picture of scholarly impact. Any institution that needs to track scholarly impact would benefit from this product, particularly if that institution maintains an institutional repository of the work of their scholars.”
How end users can be the focus of altmetrics.
“The approach of Plum Analytics is that the end user decides which metrics are important. Institutions provide Plum with data from institutional repositories, such as digital object identifiers (DOIs) for the articles that their researchers publish. Plum takes this information from there, automating the process of finding links to researchers and content through ORCID IDs, PubMed IDs, International Standard Serial Numbers (ISSNs), and many other data points to get at the information that is relevant to the user.”
Whether a single overall score helps users understand a work or hinders them.
“It is interesting to note that while the data are presented together in one place, there is no attempt to make a statement about the data as a whole—to calculate a meta-impact factor, if you will. The approach, clearly displayed throughout, is to find the data, aggregate them together, and let the end user decide which parts of the data matter most. To Plum, the more important question is from the researcher’s perspective; that is, what form does the researcher see their work taking?”
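The approach described above—aggregating metrics by source while deliberately computing no composite score—can be sketched as follows. This is a minimal illustration only; the category names, metric names, and counts are hypothetical and do not reflect PlumX’s actual data model or API.

```python
from collections import defaultdict

# Hypothetical raw metric events for a single artifact, tagged by category.
# The categories loosely echo the kinds of measures the article mentions
# (citations, downloads, abstract views, comments, social media activity).
events = [
    ("citations", "journal_citation", 12),
    ("usage", "abstract_views", 340),
    ("usage", "downloads", 85),
    ("social_media", "tweets", 27),
    ("social_media", "facebook_shares", 9),
    ("mentions", "online_comments", 4),
]

def aggregate_by_category(events):
    """Group metric counts by category; no composite score is computed."""
    grouped = defaultdict(dict)
    for category, metric, count in events:
        grouped[category][metric] = grouped[category].get(metric, 0) + count
    return dict(grouped)

metrics = aggregate_by_category(events)
for category, counts in metrics.items():
    print(category, counts)
```

The key design choice mirrors the quote: each category is reported side by side, and interpreting their relative weight is left to the end user rather than collapsed into a single number.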
Tenure and promotion may be influenced by altmetrics.
“Given the changes in how researchers use and share information, taking into account a broader measure of research impact seems prudent. It is not inconceivable to expect that Altmetrics and other measures will eventually have a role in tenure decisions and other judgments about scholarly production.”
How research has expanded from traditional presentation forms to the multiplicity of today’s works. Once again, context plays a role in the interpretation of an artifact’s metrics.
To understand what PlumX does and how, it is important to understand what it tracks. PlumX does not simply track articles; it broadens the scope of items of interest, defining these items as “artifacts.” Artifacts are defined as “any research output that is available online.”
“Artifacts include research articles, case studies, abstracts and book chapters, as one might expect. However, the term artifact also includes such disparate works of scholarship as presentation slide sets on Slideshare, video recordings on YouTube, data sets, file sets, audio recordings, figures, images, government documents, musical scores, maps, and pre-prints.”
“Plum Analytics takes the approach of gathering all metrics that researchers and institutions care about. However, overlapping data is not a concern as the context of how an artifact is shared is more important than use of a single instance of that artifact. To explain, the same article may be cited in a peer reviewed journal, shared on Facebook, bookmarked in Delicious, and have the reference URL converted into a Bitly URL and Tweeted out; these all represent distinctive types of uses. The comparison is really between apples and oranges; different artifacts mean very different things when understood in context.”
A broader perspective on the impact of research, presented quickly.
“Plum Analytics’ PlumX Metrics represents an intriguing shift in thinking about alternative metrics. By viewing Altmetrics side-by-side with traditional measures, users get a broader perspective on the impact of research, and this in turn helps lend a greater legitimacy to non-traditional measures. The product also changes how metrics are viewed through the creative use of graphs and visualizations, all with the goal of determining and measuring research impact more quickly.”