
Blog Admin

December 10th, 2013

With altmetrics on the rise, the education community can capture insights into how pedagogy research is being used.


The number of citations an article receives is often used as an indicator of impact, but there are many instances where this metric fails to capture the wider results of how ideas are distributed. Adele Wolfson and Megan Brooks look particularly at education-related publications and how alternative metrics could be utilised to capture contributions that are currently undervalued.

When those involved in traditional research publish papers, the main audience is other scholars in the same or related fields. Researchers hope the published work will be verified or refuted and built upon, and that new publications will result. The end product is the creation of new knowledge. So it makes sense that the impact of the original publication would be measured in citations. (Even that conclusion is increasingly in doubt, but more on that below.)

Image credit: Wikimedia (public domain)

When those involved in educational research publish papers, the audience includes other researchers, but the work is more often directed at educators who teach the subject matter in their classrooms. Pedagogical researchers hope their published work will be picked up in curricula, methods, and instructional approaches, and that more effective teaching will result. The end product is better student learning. Citations do not capture any measure of the use of the original publication, much less of the desired increase in student learning.

You can see the citation problem very starkly just by looking at impact factors for journals. In my own field of chemistry, one of the top U.S. venues for publication in all subfields is the Journal of the American Chemical Society, which has an impact factor of 10.677. The education journal published by the same society, the Journal of Chemical Education, has an impact factor of 0.817. The journals published by the Royal Society of Chemistry show the same trend: Chemical Society Reviews has an impact factor of 24.892, whereas Chemistry Education Research and Practice has one of 1.075. These patterns are repeated for traditional research versus educational research in other fields.
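For context, the two-year impact factor behind these figures is simply a mean citation rate over a journal's recent output:

\[
\mathrm{IF}_Y \;=\; \frac{\text{citations in year } Y \text{ to items published in years } Y{-}1 \text{ and } Y{-}2}{\text{citable items published in years } Y{-}1 \text{ and } Y{-}2}
\]

On this scale, an impact factor of 0.817 means the average recent article in the Journal of Chemical Education was cited less than once in the census year, against roughly ten citations for the average recent JACS article.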

I started thinking very naively about this issue and wondered what metrics other than citations might be used for education-related publications, not to mention for results and ideas distributed in ways other than publication. Luckily I was directed to colleagues in Library and Technology Services, who introduced me to the relatively new field of altmetrics. Even for traditional scholarship, journal impact factors and the h-index for individuals have become too limiting. Research goes beyond the academy, appearing in articles in the popular press, testimony to government and non-government committees, patents, social media, etc. Altmetrics has arisen as a way to capture this impact, which is often observed more quickly and more widely than scholarly citation (summarized in this blog by Pat Loria in October 2013). There are drawbacks to its methods, too, since many of these impacts are open to manipulation and lack peer review, but it does provide a much broader view of impact, one that is not as dependent on publication venue as older measures. The field of altmetrics is becoming sufficiently mature that there are already committees and workshops to establish standards and recommended practices for the industry.

As we considered our original questions about the impact of education research, we realized that even altmetrics as currently defined did not fully address our concerns. My colleagues and I wrote a commentary on this topic that described three approaches to raising the profile of education research, overall and for individual contributions, in hopes of generating discussion and inspiring new measures of impact.

One of these approaches is simple and easily accomplished; the other two are much more complicated. The simple one is just to get education researchers and others involved in this type of pedagogical scholarship to use the full range of tools currently available to keep track of how their work is being used. In our commentary we listed a number of these and encouraged those in the education community to select at least one such tool to get a sense of their own influence and standing. To measure impact using article-level citation counts, journal impact factor, and/or the h-index, we encourage the use of Web of Science, Scopus, or Google Scholar My Citations; for measuring article-level impact using altmetrics, leaders include ImpactStory, Plum Analytics, and Altmetric.com.
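Checking one's own numbers can be as simple as a single API call. Here is a minimal sketch in Python, assuming Altmetric.com's free v1 lookup endpoint; the response field names follow its public documentation, and the DOI is only a placeholder rather than a real article:

    # Minimal sketch: look up one article's altmetrics by DOI via
    # Altmetric.com's free v1 API (no key required for basic lookups).
    # Response field names are assumptions based on the public docs,
    # and the DOI below is a placeholder, not a real article.
    import json
    import urllib.error
    import urllib.request

    def altmetric_summary(doi):
        """Fetch the Altmetric record for a DOI; return {} if untracked."""
        url = "https://api.altmetric.com/v1/doi/" + doi
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return json.load(resp)
        except urllib.error.HTTPError as err:
            if err.code == 404:  # Altmetric has no record of this DOI
                return {}
            raise

    record = altmetric_summary("10.1021/ed085p1440")  # placeholder DOI
    if record:
        print(record.get("title"))
        print("Altmetric score:", record.get("score"))
        print("Tweets:", record.get("cited_by_tweeters_count", 0))
        print("Blog mentions:", record.get("cited_by_feeds_count", 0))
    else:
        print("No altmetric activity recorded for this DOI.")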

The next approach is more difficult in that it requires more members of the education research community to engage with social media. Since altmetrics algorithms take into account blog mentions, saves/shares in Mendeley and CiteULike, tweets, Facebook postings, and so on, we need to see more discussion of education-related topics in these venues so that they are “counted” more in determining impact. Furthermore, we ask that instructors list the sources of their materials on syllabi posted on public websites, not just on handouts and portals limited to their own classes and institutions.
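To make that “counting” concrete, the toy calculation below shows how a weighted sum over mention types produces a composite score. The weights here are invented for the illustration; real providers use their own weightings, which differ from these:

    # Toy illustration of a weighted altmetric score. These weights are
    # invented for the example; real providers use their own weightings,
    # and Mendeley readership is usually reported separately, not scored.
    ILLUSTRATIVE_WEIGHTS = {
        "news": 8.0,       # assumed: mainstream-media mention
        "blog": 5.0,       # assumed: blog post discussing the article
        "tweet": 1.0,      # assumed: Twitter mention
        "facebook": 0.25,  # assumed: public Facebook post
    }

    def composite_score(mentions):
        """Sum mention counts weighted by source type; unknown types score 0."""
        return sum(ILLUSTRATIVE_WEIGHTS.get(src, 0.0) * n
                   for src, n in mentions.items())

    # Two blog posts plus thirty tweets outscore a single news mention,
    # which is why broader social-media engagement by the education
    # community would raise its measured impact.
    print(composite_score({"blog": 2, "tweet": 30}))  # 40.0
    print(composite_score({"news": 1}))               # 8.0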

The final approach is the hardest, since it means getting the attention of altmetrics companies and convincing them to expand the range of sites they monitor. We would like to see a list created of venues where educational materials appear, and to work with altmetrics developers to incorporate these as appropriate. What sorts of venues are we talking about? Textbooks, presentations at workshops, sites for test preparation materials, test banks and syllabi repositories, funded grant proposals, and others might be included. But the first step would be convening some meetings, probably centered around education research in specific areas (science, writing, mathematics, etc.), that would include researchers, practitioners, and altmetrics representatives to define what would be most useful to everyone. If we don’t do this now, while altmetrics is still developing, we will remain where we are today, with education research undervalued because its impact is undercounted.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns on posting a comment below.

About the Authors

Adele Wolfson is the Nan Walsh Schow ’54 and Howard B. Schow Professor in the Natural & Physical Sciences and Professor of Chemistry at Wellesley College. She received her A.B. in chemistry from Brandeis University and her Ph.D. in biochemistry from Columbia University and has worked or studied in Israel, France, and Australia. Her scientific research is in the area of protein biochemistry, particularly the role of neuropeptidases in reproduction and cancer. She also conducts research on educational and pedagogical topics, including concept inventories for biochemistry, and the ways that students connect learning between science and non-science courses.

Megan Brooks is the Director of Research Services for Wellesley College’s Library and Technology Services, where she has the pleasure of working with and managing a team of creative and thoughtful research librarians, and supporting faculty and highly motivated and intelligent students. She earned her B.A. in psychology from the College of St. Benedict and her M.L.S. from Syracuse University.


Posted In: Academic communication | Impact
