In the second of a series of posts on the Impact of LSE Blogs project, Carlos Arrebola and Amy Mollett share the first findings of an LSE study that sought to examine the effects of blogging on the success of published articles. While the study proved to be more exploratory than explanatory, with the positive effects on citations particularly difficult to demonstrate conclusively, data does show that blogging enhances the overall attention paid to published research.
Social media has become a popular channel amongst academics for communicating their research, on the understanding that the open access and easy shareability of these new platforms will improve the dissemination of their work. A variety of studies have suggested that social media has a positive impact on published research, helping it spread further than it would if it were only available in an academic publication. For example, Shema et al showed evidence that blogging about research had a positive impact on its subsequent citation, by comparing the median citations of journal articles that had been blogged about with the median of those that had not. Similarly, Thelwall et al have shown that citation counts are “associated” with activity on several social media platforms, such as Twitter or Facebook, and that Mendeley readership counts are positively correlated with citations.
In the context of the Impact of LSE Blogs project at the London School of Economics, these types of studies are of particular interest, as it may be that they can be used to measure to what extent LSE’s public-facing academic blogs increase the attention drawn to particular research outputs.
A key part of the academic blogging activity at LSE is reviewing or summarising academic work that has been recently published in journals or books. This is most obvious on LSE Review of Books, which has published daily reviews of the latest books in the social sciences since 2012. It is also very apparent on the LSE US Politics and Policy (USAPP) blog, where 40% of posts are based on published journal articles. In addition, many of these academic blog posts draw on and reference previous studies published in the literature, just as any academic commentary would do. For example, Altmetric identifies at least 1,960 research outputs mentioned by the USAPP blog, with 883 mentioned by the LSE Impact Blog.
The question then remains: does all this referencing and reviewing increase the success of the papers and books mentioned? This is a question our project has sought to address through our own investigations. Although there is an ongoing debate as to what constitutes success for an academic publication, there seems to be agreement on (or at least a common practice of) measuring success in terms of the attention a publication receives. Citation counts and alternative metrics (such as the number of news mentions, Mendeley readers, or blog mentions) are considered evidence of success.
For our study, a sample of blog posts whose main purpose was to review or summarise a journal article was collected from the USAPP blog. Citation data from Scopus and Altmetric Explorer data for the reviewed articles and their journals were then combined to carry out a quantitative analysis.
First, we compared the median of citations to journal articles that have been reviewed on the USAPP blog with the median of citations to those journal articles that have not been reviewed on the blog, in order to see if there were any significant differences, following the methodology used in Shema et al (2014). Our sample included data from the 13 journals whose articles were most commonly reviewed on the USAPP blog throughout 2014. The sample contained 109 articles reviewed on the USAPP blog in 2014 and published that same year, and 1161 articles published in 2014 that had not been reviewed on the USAPP blog (n = 1270).
Of the 13 journals, six had a higher median of citations for the articles that had been reviewed on USAPP than those that had not; three had the same median; and for four the median was lower. However, these results were only significant for two of our sample journals (using a non-parametric test at the 5% level). This means that we can only consider our conclusions generally valid for those two journals, whereas for the other 11 journals any differences displayed might only exist in our sample.
The same approach was also attempted with a 2015 sample, and with a 2014 sample that eliminated outliers for articles that had been blogged about elsewhere but, again, the results obtained do not allow us to formulate general conclusions. Likewise, correlation coefficients and regression analysis did not yield significant results for our variable of interest (whether the study had been blogged about on the USAPP blog). Therefore, using the available data, it does not seem possible to conclude that blogging on the USAPP blog leads to significant differences in citations to the studies it reviews.
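The median comparison described above can be sketched in a few lines of code. The following is an illustrative example with made-up citation counts, not the study’s actual data or code, and it uses the Mann-Whitney U test (with a simple normal approximation) as one common non-parametric choice for comparing two groups:

```python
from statistics import median, NormalDist

def mann_whitney(x, y):
    """Mann-Whitney U statistic with a normal-approximation two-sided p-value.
    A simplified sketch (no tie correction); a real analysis would typically
    use a library routine such as scipy.stats.mannwhitneyu."""
    # U counts, over all cross-group pairs, how often x beats y (ties count half)
    u = sum((a > b) + 0.5 * (a == b) for a in x for b in y)
    n1, n2 = len(x), len(y)
    mean_u = n1 * n2 / 2
    sd_u = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    z = (u - mean_u) / sd_u
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return u, p

# Hypothetical citation counts for one journal (NOT the study's real data):
reviewed = [12, 5, 8, 20, 3, 15, 9]                   # reviewed on the blog
not_reviewed = [4, 7, 2, 10, 6, 3, 5, 8, 1, 9, 4, 6]  # not reviewed

u, p = mann_whitney(reviewed, not_reviewed)
print(f"median reviewed: {median(reviewed)}, not reviewed: {median(not_reviewed)}")
print(f"U = {u}, p = {p:.3f}, significant at 5% level: {p < 0.05}")
```

With these particular made-up numbers the reviewed group has the higher median, yet the difference is not significant at the 5% level, mirroring the pattern observed for most journals in the sample.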
However, this does not mean that blogging about research published elsewhere is trivial. On the contrary, there is evidence in our sample to suggest that reviewing a published academic article on the USAPP blog has a positive impact on the attention the research study attracts, even if we cannot conclude that such attention converts into significant differences in citations. The USAPP blog attracts a steady stream of readers to its review posts. In particular, our sample of blog posts averaged 728 views per post in 2014, with some proving particularly successful in terms of readership. To give an example, a USAPP blog post summarising a study that examined celebrity support of political parties has been viewed more than 7,700 times. Attention to this particular research has clearly been enhanced by the blog post: according to the publisher’s web page, the paper itself has been downloaded only 213 times since December 2016, compared with more than 1,500 USAPP blog post views in the same period (figures accurate as of early June 2017). Such enhanced attention means the overall audience extends far beyond what would be possible were the findings only made available in the original journal article.
Bearing these readership figures in mind, we also analysed the relationship of the USAPP blog posts with counts in other social media platforms. We calculated the Spearman correlation coefficients of the number of views of our sample of blog posts (obtained through Google Analytics) with the number of Mendeley readers that bookmarked the article that had been blogged about, the number of news outlets and blogs mentioning the original study, and the number of tweets mentioning the original study (data obtained through Altmetric Explorer). The correlation coefficients show a positive, albeit low, correlation for all of those variables. Nonetheless, the usual “correlation is not causation” disclaimer applies. We cannot be sure whether there are more readers of the posts on the USAPP blog because the original study has received attention elsewhere, or whether it has received attention elsewhere because it was blogged about on the USAPP blog, or whether there are other reasons that explain the relationships observed.
In summary, this sample study has proved to be more exploratory than explanatory. Although it has not provided any definitive answer as to whether journal articles show significant differences in citation rates after being reviewed on one of our blogs, the different sources of data explored have provided some evidence that LSE blogs may help increase the attention paid to the studies they review. Most notably, for researchers who summarise their published research on one of our blogs, it provides a platform that reaches a wide audience. Blogging about a study helps to communicate it in more general, easily understood terms and often removes the article paywall too. As a result, much more attention can be drawn to a study than if it were simply published in an academic journal.
Please join us at LSE on Wednesday 14 June to find out more about the findings of the research project and to hear a number of expert panellists from the Financial Times, The Economist, House of Commons Library, and Altmetric, among others, discuss the future of research communication. Sign up here!
Also, if you’re a PhD student or early career researcher interested in blogging about your research but unsure about where to begin, please join us for one of two workshops being held on Monday 12 June at LSE’s PhD Academy, run by LSE blog editors and LSE academics. “How to Blog Your Research” sessions are being held at 2:30pm-4pm and 4:15pm-5:45pm – sign up today!
Note: This article gives the views of the authors, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the authors
Carlos Arrebola is Research and Blog Impact Officer for the Impact of LSE Blogs project, exploring the impact that LSE Blogs have on the academic community. He is also a research student at the University of Cambridge, where his work focuses on European Union and Competition law. Prior to that, he studied Law and Economics at Universidad Carlos III de Madrid. He sometimes tweets as @carrebola.
Amy Mollett is Social Media Manager at the London School of Economics and Impact of LSE Blogs Project Manager. She is co-author of the new SAGE book Communicating Your Research with Social Media. She previously managed several blogs at LSE, including LSE Review of Books and the Impact Blog. With her co-authors, Amy has won a Times Higher Education Award for Knowledge Exchange. Amy is a graduate of the London School of Economics and the University of Sussex. She tweets @amybmollett.