Mendeley is emerging as a leading source of data on how ideas spread and which academics are the most widely read and influential in their fields. At Altmetrics12, a gathering of leading researchers studying how social networks and the web are changing research, several presenters examined how Mendeley’s readership data compares with traditional citation metrics. This work provides independent third-party validation of Mendeley’s research stats and enables developers to create discovery tools that serve the needs of many different types of research consumers. How do you use altmetrics? Take our survey!
Jason Priem, Heather A. Piwowar, and Bradley M. Hemminger presented a paper, Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact. The authors looked at the emerging role of social media in scholarly communication, studying social bookmarks, blog posts, Twitter, Facebook, Wikipedia, and many other sources, and comparing them with one another to see what usage patterns might emerge. They found that the different indicators of use were broadly correlated but independent, with Mendeley readership of PLoS papers being the only social metric significantly correlated with citation counts. Facebook sat at the bottom of the stack, putting yet another nail in the coffin of the “Facebook for Research” idea. Interestingly, they also showed that patterns of use across different social media reflect different types of research consumption, suggesting that there are distinct subcategories within research: for example, a group of papers that is widely shared but seldom cited (such as this paper, which had very high pageviews for June but relatively few readers adding it to their Mendeley libraries) and a group that is often cited but little read, such as the published versions of popular arXiv preprints.
Judit Bar-Ilan also presented a study of Mendeley’s coverage of JASIST, a premier information science journal. Not only did Mendeley’s coverage of that journal rival that of Scopus, Web of Science, and Google Scholar (it has previously been reported that our coverage of Nature and Science is similarly high), but Mendeley readership was also correlated with WoS and Scopus citation counts for JASIST articles. As in the Priem et al. study above, significant activity is reflected in the readership stats that is not captured in the citations reported by WoS and Scopus.
Overall, the conference was a fascinating collection of people doing research on the research process itself, and there were great discussions lasting well into the night. Richard Price of Academia.edu asked whether anyone knew to what degree these metrics, relative to the widespread use (and abuse) of the Journal Impact Factor, had penetrated the decision-making processes of grant reviewers and tenure committees, so I thought this would be a perfect opportunity to survey our community of researchers on this question.
If you’ve been on a review board or tenure committee, would you please take our five-question survey? It takes less than a minute to complete.