Knowledge and awareness of research impact-related concepts and terminology varies greatly among researchers around the world. To help researchers test their “research impact muscles” and see how they compare to their peers, Kudos have developed a circus-themed quiz! Lauren McNeill explains how this quick, light-hearted quiz will help researchers increase their understanding of how to maximise the likelihood of their research being found, read, cited, and applied by a broader audience. Early game data reveals some interesting findings, with more than half of researchers claiming not to have received training to help them increase the impact of their research, and only a third reporting that they consider how to maximise the impact of their research prior to their project beginning.
Over the last five years, we’ve spoken to hundreds of researchers about their practices and needs in relation to dissemination and impact. As you’d expect, there is a wide range in terms of people’s awareness and knowledge. Even in the UK, where there has been a strong focus on impact and some excellent training is available, many academics remain unfamiliar with impact-related concepts and terminology.
In this context, we developed a light-hearted quiz to help researchers test their research impact “muscles” and compare themselves to other researchers, both in terms of how much they know about research dissemination and impact and the effectiveness of their current efforts. The circus-themed quiz acts as a starting point to help researchers increase their understanding of how to maximise the likelihood of their research being found, read, cited, and applied by a broader audience.
The six questions aim to address some of the key areas that contribute to the overall reach and impact of research, and will hopefully provide researchers with some vital tips and insights into other channels and methods for increasing their impact.
At the end of the quiz each participant receives a “research impact score” and tips, which, although light-hearted, will hopefully help the researcher identify areas for improvement – or, alternatively, provide them with the reassurance that they are actually doing well when it comes to increasing the impact of their work. There are randomly selected prizes for both individuals and institutions.
The quiz is not designed as a research instrument and is for individual guidance rather than aggregate analysis, but, with those caveats in mind, it’s interesting to look at the responses so far:
- 51% of participants claim not to have received training to help them increase the impact of their research, while 44% stated they had. (How does this compare to institutional provision of impact training? Comments welcome!)
- Only 33% of participants start thinking about how to maximise the impact of their research prior to the project beginning. 18% started to think about impact during their research project; 17% whilst preparing to disseminate outputs; 23% post-publication; and 9% hadn’t thought about the impact of their work at all!
- 48% are focused on “conceptual impact” (building evidence, knowledge, and awareness). Only 26% are focused on “instrumental impact” (changing policy, behaviour, or practice); and 23% are focused on “capacity building” (building skills, expertise, and jobs).
- Academic networks (ResearchGate and Academia.edu, for example) are the most popular channels for increasing awareness of research. These were closely followed by identity systems (such as ORCID, Scopus ID, and ResearcherID), email, and social media (Facebook, Twitter, and LinkedIn). The use of press releases to increase awareness of published work was the least popular channel.
- Researchers in medicine and the medical sciences scored most highly, closely followed by those in electrical engineering and chemistry. Among the subject fields featuring less often in the top-scorer ranking were sociology, civil engineering and construction, and tourism, hospitality and events.
With 6,000 participants so far, from all over the world – largely in Italy, the United States, the United Kingdom, and India – game data suggests that the more widely the “impact agenda” is adopted, the greater the gap that emerges in terms of planning and managing impact, rather than simply trying to measure it. For us, this means taking our service “upstream” so that we can help people plan and manage communications around their projects from a much earlier stage (rather than focusing on the post-publication phase as we do currently).
You can “Test Your Research Impact Muscles” at kudosimpactgame.com
This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.
About the author
Lauren McNeill is Marketing Manager of Kudos, which helps researchers, publishers and institutions to maximise the reach and impact of their research. As a passionate digital marketer, she is on a mission to make research more accessible by encouraging and demonstrating (to researchers) the opportunities available when disseminating research via online channels such as social media. She tweets @laurenemcneill.