Todd Carpenter
Nearly three years to the day after Plum Analytics was acquired by EBSCO, Elsevier announced this morning that it has acquired the altmetric data aggregator’s brand, tools, data, and the team working on the project. The deal reinforces Elsevier’s collection of metrics offerings. It also provides some insight into the state of alternative metrics and their relationship to the multi-stakeholder academic community.
Rather than portending something amiss in the altmetrics space, this deal appears to signal a developing understanding of where altmetrics sit in academia and who is most interested in them and why.
Read the full article
Lauren B. Collister, PhD (Scholarly Communications Librarian, University Library System, University of Pittsburgh) and Timothy S. Deliyannides, MSIS (Director, Office of Scholarly Communication and Publishing and Head, Information Technology, University Library System, University of Pittsburgh)
PlumX provides both altmetrics and traditional metrics from a variety of sources. Showing these side by side allows researchers to see all of the different impacts of their article in one place and to evaluate them as they see fit. It does not place one metric above another as more valuable; instead, it presents categories that may be of interest in different situations. A researcher interested in knowing who is talking about their article might find the Social Media or Mentions categories most valuable; a tenure and promotion committee member may be interested in evaluating the traditional metrics in the Citations category and the “buzz” around an author’s work using the other four categories. PlumX provides numbers that can be evaluated according to the user’s needs; there are no “scores” like those present in Altmetric, ImpactStory, and many other outlets, which can obscure the meaning of the data beneath the score.
Read the full article
Ian Chant
So you’ve established an institutional repository (IR), where users can put papers, theses, and experimental data on file, making it easily accessible to the larger world. While getting an institutional repository up and running is no small feat, it’s only the first step. To make the most of this tool, you have to fill it, and that means getting ongoing participation from faculty and students.
Read the full article
Peter Nieuwenhuizen, Senior Consultant Information Management at Rijkswaterstaat in the Netherlands
Peter Nieuwenhuizen on November 25, 2015
Alternative bibliometrics, article-level metrics (ALM), or altmetrics have been in use since 2012 to measure the impact of a specific article, rather than the impact of a journal as a whole. They can also be used to measure the social impact of non-scholarly publications, such as governmental ones. Several applications support this new way of measuring, such as PlumX and Altmetric. They measure the use of the information and its social impact by monitoring social platforms like Facebook, Twitter, YouTube, Vimeo, and SlideShare, but also Mendeley, Reddit, Wikipedia, Delicious, Bitly, Scopus, WorldCat – and many others.
Measuring by alternative metrics is a quick method, and the results are easily reusable for visual representation.
For this reason, Rijkswaterstaat, the executive governmental organisation of the Dutch Ministry of Infrastructure & Environment, performed a case study.
Read the full article
Proceedings of the 2015 ORCID-Casrai Joint Conference, F1000Research 2015
We made the decision to use PlumX after evaluation of it in comparison with a similar product from Altmetric.com called Altmetric for Institutions. While there is some overlap in the metrics types and sources used by the two services (primarily those from social media sites), we also found significant differences. Altmetric.com has emphasized finding meaningful mentions of published articles and conference papers in the news media, on blogs, and in government documents. However, the broader net that PlumX throws in other areas struck us as likely to give us a more comprehensive view of the varied types of research output produced by KAUST authors (such as computer code, datasets, videos and presentations), and also the varied metrics associated with their usage (downloads, views, citations, etc.). PlumX’s use of the ORCID public API to remain regularly updated with new works added to a researcher’s profile also presents an attractive method of showing researchers a benefit of using the KAUST repository, ORCID and the KAUST/ORCID integration.
Read the full article
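As a sketch of the ORCID integration described in the excerpt above: the ORCID public API exposes a researcher’s works at `https://pub.orcid.org/v3.0/{orcid-id}/works`, and each work summary carries a `put-code` identifier, which makes it straightforward to detect works added since a previous poll. The helper below is illustrative only (the function names and sample data are assumptions, not PlumX’s actual implementation):

```python
import json
import urllib.request

# ORCID public API works endpoint (v3.0); {orcid_id} is e.g. "0000-0002-1825-0097"
ORCID_WORKS_URL = "https://pub.orcid.org/v3.0/{orcid_id}/works"

def fetch_works(orcid_id: str) -> dict:
    """Fetch the works summary for one ORCID iD (requires network access)."""
    req = urllib.request.Request(
        ORCID_WORKS_URL.format(orcid_id=orcid_id),
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def new_put_codes(works: dict, seen: set) -> list:
    """Return put-codes (ORCID's per-work identifiers) not seen before,
    i.e. works added since the last poll."""
    codes = []
    for group in works.get("group", []):
        for summary in group.get("work-summary", []):
            code = summary["put-code"]
            if code not in seen:
                codes.append(code)
    return codes

# Example with a stubbed response in the shape the v3.0 works endpoint returns:
sample = {"group": [{"work-summary": [{"put-code": 101}, {"put-code": 102}]},
                    {"work-summary": [{"put-code": 103}]}]}
print(new_put_codes(sample, seen={101}))  # → [102, 103]
```

How PlumX actually schedules and ingests these updates is not documented here; this only illustrates the public works endpoint and its response shape.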
Guest blog post by Plum co-founder, Andrea Michalek on August 26, 2015
The landscape around alternative metrics has been evolving rapidly. Having gone from a Twitter hashtag in 2011 to a key component of research evaluation at many institutions, metrics that move beyond citation counts and the Journal Impact Factor are here to stay.
Some key excerpts are below. Read the full post for more.
Citation analysis is a lagging indicator of prestige. Citations can take 3-5 years to accrue the critical mass necessary for meaningful analysis.
Not all influences are cited in an article. Research outputs other than a journal article are typically not cited.
Securing research funding is getting more competitive. If researchers can show that their recent research is generating a lot of interaction in the scholarly community, that information can provide an advantage in this tight funding environment.
Be Comprehensive. When you start with the question, “What do you consider to be your research output?” and you ask that across many different disciplines, you start to build a base that tells a more complete story of the outcomes of research.
Measure at the Artifact Level – Not the Journal. There are bad articles in high impact factor journals, and great articles in low impact factor journals. Even if Journal Impact Factor (JIF) were a perfect measure of the quality of a journal, it would still be an inappropriate measure of the quality of a particular article in that journal.
Better Visualizations Lead to Better Understanding. The key to quickly navigating complex data and gaining insight from it is to use elegant, simple visualizations that do the hard work for you.
Article Level Metrics are just a Building Block. Insight comes from being able to pull article level metrics together to tell the stories of the people, the groups that they are affiliated with, and the topics they care about.
Metrics that Keep Pace with Online Scholarly Communication. The data that we collect, and the way we represent it, need to capture today’s interactions yet remain flexible for the future.
Read the full article
Library Journal, January 2015.
While there is no denying the value to research universities, smaller institutions can also benefit from PlumX’s available tools and widgets to showcase new departments, programs, or publications. Analytics can be easily downloaded in PDF or PNG format and embedded into reports, grant applications, PowerPoint presentations, and even CVs. These widgets are available at the group, individual author, and artifact level. Groups can be departments, labs, museums, journals, custom collections, and so on.
Read the full article
JMLA — Reviewed by Robin Champieux, January 2015.
PlumX is a powerful tool: both the data and the user interface can support multiple use cases and research questions. For example, Oregon Health & Science University (OHSU) Library is exploring how to use PlumX to help individual researchers uncover the full impact of their work, especially among different populations. OHSU wants to enable researchers to tell more nuanced stories about their science, especially in the context of funding proposals and the new biosketch format that the National Institutes of Health is piloting. In this sense, successfully using PlumX and altmetric data more generally is very dependent on asking the right questions and thinking creatively and analytically about what the metrics are indicating.
Read the full article
Charleston Advisor, January 2015.
The metrics and analytics uncover the qualitative stories about what is happening with research output. Revealing the over-hyped or ‘popular’ research is another example of a story that the metrics allude to. For instance, if you look at a PlumX report on the output of a researcher and see that one article stands out in its social media interactions, you do not just take the number and say that that is necessarily good—you look behind the numbers and see that the paper was controversial, over-hyped, slammed by colleagues, etc.
Read the full article
Philadelphia Business Journal, January 23, 2015.
For the past 15 years, she’s persevered through the bias and pushed her way to the top in a field that’s been mostly dominated by men.
Now, she’s making it her mission to “be out there as a female senior technologist … just to be an example,” she said. These visible roles allow girls and women alike to think about the possibilities tech careers bring.
Read the full article
Outsell, October 30, 2014.
- Evaluation criteria for three altmetrics offerings: Altmetric Explorer for Institutions from Altmetric, ImpactStory, and Plum Analytics’ PlumX.
- Ratings of each product for quality and performance, organisational factors, pricing transparency, sales resources, and customer service.
- Suitability of each offering for enterprise research communities.
- Imperatives for information managers looking to select an altmetrics service provider.
Spoiler alert: PlumX ranked highest.
Read the full article
Research Information, October/November 2014.
She [Andrea Michalek] said there is plenty more to do. ‘To this point, we have worked hard on gathering as many metrics about as many research outputs as we could. Now we are moving beyond gathering and organising this information, and are creating ways to make it simpler to gain insight from this raw material in the forms of reports and other mechanisms,’ she noted.
Read the full article
Against the Grain by Andrea Michalek & Mike Buschman, April 2014
Looking at alternative metrics can help your collection. By knowing in which journals your faculty publishes, you can ensure that you subscribe to these journals. Not only will your faculty appreciate this, but your students will also have access to research that is important to your institution. In addition, you will have a better understanding of the usage and other categories of metrics about your resources beyond your own institution’s COUNTER statistics.
Read the full article
Editorial Office News of the International Society of Managing and Technical Editors by Andrea Michalek, Mike Buschman & Marianne Parkhill, May 2014
There is good news here for publishers. Altmetrics do not have to be just for authors. Since a metric dashboard such as PlumX consolidates metrics at any level, including journal or issue, publishers have unprecedented information to help manage their publications. They can now answer a series of questions including:
- How are issues performing over time?
- Are we recruiting the right authors?
- Can we tell if a discipline, journal, or author is “on the rise”?
- Are our competitors promoting their articles and authors better than we are?
- How do we provide more value to our authors?
Read the full article
The Scholarly Kitchen by Judy Luther, February 5, 2014
Plum Analytics’ new position as a wholly owned subsidiary of EBSCO is a significant partnership with far-reaching impact that will benefit both companies and advance the development of altmetrics. When a company the size of EBSCO invests in an emerging area, it serves as an endorsement of this new field and acknowledges its anticipated potential.
Read the full article
Library Journal by Elizabeth Michaelson, November 7, 2013
The scholarly landscape is undergoing vast changes, with open access revolutionizing how publishing happens and how quickly and easily patrons can access new information and thinking on various topics. Scientific writing is probably the best-known example, with services such as PubMed gaining great attention, but other fields, such as the digital humanities, are not far behind. Still, though, tenure and other professional recognition have tended to be based on traditional metrics such as the impact factor of the journals in which a scholar publishes. Plum Analytics, a company founded in 2011 by entrepreneurs Andrea Michalek and Mike Buschman, has started to change all that, leading to its nomination as most ambitious database by LJ’s reviews editor Henrietta Thornton-Verma.
Michalek and librarian Buschman led the team that developed ProQuest’s Summon discovery system; PlumX builds upon their information-retrieval expertise to help academics gather information on all kinds of scholarly activity—from papers in peer-reviewed journals to social media mentions—into a broader picture of an academic’s professional life. A subscribing institution can use PlumX in many ways, from vetting job applicants to forming a “big picture” of the institution’s research activity for funding purposes or to draw students, and individuals who create a profile will have the benefit of getting credit for many more types of work than was possible before.
Read the full article
Library Journal by Bonnie J.M. Swoger, August 29, 2013
PlumX is an analysis tool aimed at helping libraries and research administrators understand the influence of their researchers’ work by using newer alternative metrics, called altmetrics, alongside traditional measures of research impact…
PlumX seeks to provide administrators with a bird’s-eye view of the influence of their organization or group by providing access to traditional citation metrics and newer alternative metrics in one interface…
PlumX may be of interest to academic libraries, special libraries, research support offices, and anyone seeking to better understand how the research output of their organization is being used.
Read the full article
The Chronicle of Higher Education by Jennifer Howard, June 3, 2013
Timothy S. Deliyannides is director of the office of scholarly communication and publishing at Pittsburgh and head of information technology in the library system there.
A critical part of the library’s job is helping the research faculty “understand and be able to measure the impact of their works,” he says. “And since much of their work takes place online now, and not just in the cited periodical literature, there are lots of new ways to measure their impact.”
“It’s really useful for representing the immediacy of impact that was hidden before,” Mr. Deliyannides says. The Pittsburgh library has been fairly quiet about the experiment. “We’re not really on a crusade to change any of the university’s normal processes for tenure or review,” Mr. Deliyannides says. “But we hope people will think of new ways to use this data. We do feel it’s valid data and something that hasn’t been gathered or reported before.”
Read the full article
ASIS&T Bulletin by Mike Buschman and Andrea Michalek, April/May 2013
It is not surprising that a metric created in the pre-digital world of the 1960s misses a lot of impact and usage. That failure does not make citation analysis inherently bad; it is still a useful tool. But, it does make it inadequate for a complete picture of the usage and impact both of research articles and other research artifacts. To create that complete picture, Plum Analytics studied all of the ways that research artifacts, from articles to videos and everything in between, are made available and used…
By capturing valuable metrics in all of these categories and creating a more complete representation of research and researchers, Plum is able to provide a more holistic picture than traditional citation analysis. While many will claim that these newer metrics are “alternative,” it is our position that all these metrics are anything but alternative. They are readily available, abundant and essential.
Read the full article
Library Journal, The Digital Shift by Matt Enis, Feb 5, 2013
Researchers have long contended with a problem with timeliness. Peer-reviewed articles are often published about a year after submission. Likewise, peer-reviewed articles that cite, praise, criticize, or discredit their work won’t appear for at least another year after that. As a result, there can be a lag of three to five years before citations begin offering enough information to indicate the effect that a given piece of research has had on a field. Even then, citations alone may not offer a complete view of the impact of that research.
“Measuring the impact of research through the traditional methods—counting citations in published literature—is important, but it doesn’t tell the whole story,” said Timothy Deliyannides, Director of the Office of Scholarly Communication and Publishing and Head of Information Technology for the University of Pittsburgh.
Read the full article
The Scholarly Kitchen by Judy Luther, July 25, 2012
The most recent entrant in this arena is Plum Analytics, founded by Andrea Michalek and Mike Buschman, who were team leaders in the successful development and launch of ProQuest’s Summon. Andrea is building a “researcher reputation graph” that mines the web, social networks, and university-hosted data to map relationships between a researcher, his institution, his work, and those who engage with it.
Read the full article
Information Today by Barbara Quint, June 28, 2012
According to Rush Miller, university librarian and director at the University of Pittsburgh, the Plum service will “work in tandem with traditional measures to assess the impact of Pitt research in non-traditional venues. These days scholars are no longer waiting to publish their research in formal publications. They’re using Twitter, social networks, blogs, etc. to publish research and thoughts as they occur. Plum will match Pitt’s researchers to their own database.”
Read the full article
Library Journal, The Digital Shift by Michael Kelley, May 31, 2012
Two prominent veterans of the library vendor world recently launched a startup company that aims to capitalize on the rapidly flowering field of altmetrics… Altmetrics (short for alternative metrics) provides a new way to measure the impact of scholarly communication. Rather than rely solely on the traditional and slow measure of citations in peer-reviewed articles (the impact factor), altmetrics provides a complementary, instant measurement window that includes all Web-based traces of research communication. It pulls together all the usage data about each individual output a researcher has produced.
Read the full article
Guest editorial by Plum co-founder, Mike Buschman, in UKSG eNews April 13, 2012
Academic libraries are inherently involved in the research creation process as well as the procurement and collection of research. Thus, they are uniquely positioned to effect change in order to provide science with more timely, open, and modern ways of scholarly communication.
Read the full article