
Mariya Stoilova

Sonia Livingstone

September 17th, 2019

How do we know it’s effective? Approaches to measuring child outcomes from helpline services


Estimated reading time: 10 minutes

LSE’s Mariya Stoilova, Professor Sonia Livingstone, and Sheila Donovan worked with the UK-based NSPCC to investigate how children’s helplines measure their effectiveness.

The work of child helplines is vital for reducing children’s risk of harm and vulnerability and for improving child protection and wellbeing. Globally, child helpline services are estimated to take around 9 million calls annually. Child helplines are recognised as a key component of child protection services: they contribute to creating accessible and child-friendly reporting systems and can help to ensure the implementation of children’s rights. Understanding the evidence on the effectiveness of child helplines can inform improvements to helplines and to related child protection mechanisms. In collaboration with the NSPCC, we carried out a systematic evidence mapping, comprising an extensive search of 18 databases, which reviews the existing literature on the effectiveness of child helplines, with a particular focus on identifying helpline outcomes and approaches to measuring effectiveness.

The findings report, launched today, is available here. It highlights a number of positive outcomes and challenges:

  • Key positive outcomes include a competent, efficient and approachable service: the evidence demonstrates that helplines can achieve improvements in relation to a range of child outcomes, such as children’s wellbeing, self-confidence, levels of anxiety and distress, and ability to deal with their current situation. Help-seekers sometimes put more emphasis on emotional support, such as being listened to and understood, than on problem-solving. Children value being able to talk to professionals who are knowledgeable and can be trusted, being able to discuss issues of concern, and feeling welcome.
  • While the overall effects are generally positive, helpline effectiveness varies depending on the outcomes measured: fewer effects are observable in relation to skill development or changes in individual, family and social behaviour and practice. This may be due to the nature of the service offered or to the methodological challenges in capturing these effects. Not all users see the same positive results – some experience no effect or even negative outcomes. The children who are worse off after using helpline services are usually those who are unable to achieve the outcomes they expect, who are unhappy with the advice and help offered or who find that, while the advice is good, it is difficult to follow. Setting realistic and measurable helpline outcomes that capture service performance can help to identify gaps and improve support, particularly for the children who benefit the least.
  • Extended helpline support via digitally mediated services offers distinct benefits: online services are usually seen as more accessible (for those with internet access) as they have no geographical or time boundaries. Arguably, for some children they offer greater emotional safety and security because of the reduced emotional proximity to the counsellor, and they can provide greater privacy and anonymity and a better opportunity to plan the interaction. This makes online support the preferred option for children who feel uncomfortable contacting a telephone helpline, particularly those with more complex and emotionally charged issues, LGBTQ youth, and those struggling with mental health issues or a speech impairment. However, particular challenges arise from online communication, including the difficulty of recognising distress due to the lack of non-verbal and paralinguistic information (expression, voice, intonation, pauses), the need to employ compensatory strategies (e.g., emoticons, expressing emotions in text, discussing misunderstandings if they occur) and digital inequalities related to access and skills.
  • The main strength of the evidence base is that it represents children’s concerns and issues in a comprehensive way: most child helpline evaluations provide a good overview of the key issues children face, often drawing on case studies that vividly document children’s experiences. Some helplines particularly prioritise child representation, advocacy and rights, while others provide ‘on-demand’ child support and evidence in response to particular socially relevant issues. These efforts to provide evidence of the issues that concern children themselves are important, as they can help to open up the decision-making process and ensure children’s issues are represented at policy level.
  • Circulation of good practice: effective collaboration within the sector – with referral agencies, educational and governmental institutions and the wider public – is often mentioned as key to effective child protection. Still, there is little evidence related to effects occurring at the level of family, community, cross-agency networks or society as a whole. Given the substantial efforts to create change at all these levels, these evidence gaps are most likely the result of methodological difficulties in attributing such effects. Addressing these challenges may require collaborative evaluations, along with tracking longer-term impact and even establishing evaluation standards across the sector.
  • Helplines generally strive to monitor their performance and use evidence to improve their support to children: a wide range of evidence-gathering and evaluation methods exists, from small-scale micro studies to more comprehensive multi-method designs, and from single-point ‘snapshots’ to comparative design studies with a longitudinal focus. While some studies rely on theoretically and empirically informed designs and map the organisational aims onto the service outcomes, many have a patchier approach and do not always draw on clearly defined outcomes in their assessments.
  • The helpline sector faces substantial challenges in evaluating its effectiveness: this is because of both the struggles of individual helplines (often small organisations) to carry out comprehensive and robust evaluations and the lack of established and recognised models and outcome measures within the sector. Other challenges reflect the nature of the work itself: (1) the brief and confidential contact poses challenges to measuring outcomes and following up with service users, making long-term effects in particular hard to establish and demonstrate; (2) better outcomes data often means more intrusive research methods, which may be unsuitable for children – especially those who have self-identified as needing helpline support; (3) demonstrating a direct link between positive outcomes and helpline support is difficult given the diversity of factors (behavioural, psychological, socioeconomic, use of other services) that may influence child outcomes; (4) the prevention of negative effects is particularly hard to measure and demonstrate, as is impact on wider social trends (e.g., the incidence of child sexual abuse, violence, bullying or suicide).

In spite of these challenges, the review identified a number of good practice examples and approaches to overcoming barriers to effectiveness. Key lessons include:

  • Using multiple measures and indicators: one of the strengths of the evaluations included in the review relates to the use of mixed methods to demonstrate the effectiveness of the service. Given the sensitive settings in which helplines operate, the number of issues raised and the various channels of support that are used at present, combining a range of methods, including both quantitative and qualitative measures, can provide stronger evidence.
  • Quality of evidence: it might be beneficial to distinguish between ‘hard’ and ‘soft’ outcomes and to use different types of evidence to demonstrate effectiveness. Combining less robust data that a helpline can gather longitudinally with a single high-quality evaluation can offer an effective approach to demonstrating effectiveness.
  • Using standardised measures: the use of standardised measures with established reliability, validity and applicability to children is a good way of producing robust evidence of effectiveness.
  • Mapping aims, outcomes and measures: having a clear model of how the organisational aims match the desired outcomes and how the evidence can support the demonstration of effectiveness is crucial.

For more details and case studies, read the full report published by NSPCC as part of the Impact and Evidence series.

This article represents the views of the authors, and not the position of the LSE Media Policy Project, nor of the London School of Economics and Political Science. 

About the author

Mariya Stoilova

Dr Mariya Stoilova is a Post-doctoral Research Officer at the London School of Economics and Political Science (LSE) and an Associate Lecturer in Psychosocial Studies at Birkbeck, University of London.

Sonia Livingstone

Sonia Livingstone OBE is Professor of Social Psychology in the Department of Media and Communications at LSE. Taking a comparative, critical and contextual approach, her research examines how the changing conditions of mediation are reshaping everyday practices and possibilities for action. She has published twenty books on media audiences, media literacy and media regulation, with a particular focus on the opportunities and risks of digital media use in the everyday lives of children and young people. Her most recent book is The class: living and learning in the digital age (2016, with Julian Sefton-Green). Sonia has advised the UK government, European Commission, European Parliament, Council of Europe and other national and international organisations on children’s rights, risks and safety in the digital age. She was awarded the title of Officer of the Order of the British Empire (OBE) in 2014 'for services to children and child internet safety.' Sonia Livingstone is a fellow of the Academy of Social Sciences, the British Psychological Society, the Royal Society for the Arts and fellow and past President of the International Communication Association (ICA). She has been visiting professor at the Universities of Bergen, Copenhagen, Harvard, Illinois, Milan, Oslo, Paris II, Pennsylvania, and Stockholm, and is on the editorial board of several leading journals. She is on the Executive Board of the UK Council for Child Internet Safety, is a member of the Internet Watch Foundation’s Ethics Committee, is an Expert Advisor to the Council of Europe, and was recently Special Advisor to the House of Lords’ Select Committee on Communications, among other roles. Sonia has received many awards and honours, including honorary doctorates from the University of Montreal, Université Panthéon Assas, the Erasmus University of Rotterdam, the University of the Basque Country, and the University of Copenhagen. She is currently leading the project Global Kids Online (with UNICEF Office of Research-Innocenti and EU Kids Online), researching children’s understanding of digital privacy (funded by the Information Commissioner’s Office) and writing a book with Alicia Blum-Ross called ‘Parenting for a Digital Future’ (Oxford University Press), among other research, impact and writing projects. Sonia is chairing LSE’s Truth, Trust and Technology Commission in 2017-2018, and participates in the European Commission-funded research networks, DigiLitEY and MakEY. She runs a blog called www.parenting.digital and contributes to the LSE’s Media Policy Project blog. Follow her on Twitter @Livingstone_S

Posted In: Children and the Media | LSE Media Policy Project
