Two recent reports consider the long-term impact of the UK’s Sure Start programme, which aimed to narrow the gap between children from lower- and higher-income families. As we look to the future, Naomi Eisenstadt, Director of the Sure Start Unit for its first six years, offers four lessons that policymakers interested in early years support services can take from the programme.
Sure Start was a major early years programme in the UK that was launched in 1999 with the aim of narrowing the gap in outcomes between children from low-income families and their better-off peers. The programme grew to cover more areas and more activities up until 2010, after which point funding was steadily reduced and many Sure Start activities ceased operation.
The Institute for Fiscal Studies (IFS) recently published two reports highlighting positive medium- and long-term health and educational outcomes for children who lived within a short distance of a Sure Start centre. A third report, due this summer, will cover outcomes relating to children’s social care needs and involvement in crime. Based on everything we know about the impact of the programme to date, we can ask: what are the lessons for policymakers interested in early years initiatives today?
How Sure Start worked – and evaluations to date
A brief overview of the programme helps set the scene. Announced in Parliament in 1998, Sure Start was an “area based” initiative: funding was allotted not through a competitive bidding process, but through an analysis of poverty levels at the neighbourhood level. The intention was to reach poor children not by targeting individual families, but by choosing areas with the highest concentration of children in poverty, and then providing a core set of early years services available to all families with young children in the catchment area. A fundamental feature of the original design of Sure Start was that local programmes would be run by local boards including representatives from health, social care and education, as well as local parents. Each board would decide precisely what set of services was needed in its area in order to reach the goal of improving life chances for young children from low-income families.
From 2003 onwards there were significant changes in the design and funding of Sure Start. The initial group of 500 local programmes was rebadged as Children’s Centres, which continued to be generously funded, and the government promised a Children’s Centre in every locality. This vast expansion meant that Centres opened from 2003 onwards received significantly less funding than the original 500. From 2010 onwards, funding for the overall programme was eroded. The Stop Start report, published by the Sutton Trust in 2018, found that at least 1,000 Children’s Centres had closed, and many more were a shadow of what they had been: open fewer days, offering fewer services, and in many cases reduced to centres for children’s social care referrals with almost no open access services. A programme that promised a diverse range of family support services, open to all and within walking distance of home, was decimated.
There have been two government-funded evaluations of Sure Start, the National Evaluation of Sure Start (NESS) headed up by Professor Ted Melhuish and Professor Jay Belsky, and the Evaluation of Children’s Centres in England (ECCE) led by Professor Pam Sammons and Professor Kathy Sylva. The NESS study looked mainly at the original 500 Sure Start Centres and ECCE looked at a mix of fully-funded centres and some whose funding was reducing. Both came up with similar conclusions. The major impact of Sure Start was largely on parents and parenting behaviours, including improvements to the home learning environment, less harsh parenting and less home chaos. ECCE was able to track use of services and found better results in families that used more services – and that centres experiencing reduced funding got poorer results. Both studies found no differences in social or cognitive development between children living in an area with access to a Sure Start Children’s Centre and those where there was no centre.
The two recent IFS studies cited above show significant differences between children living in areas with access to the original model of Sure Start Centres and those without access. On health, the studies found increased hospitalisations in the first year of life, followed by decreased hospitalisations well into middle childhood. On education, they found better GCSE results, fewer demands for SEND (special educational needs and disability) support and fewer demands for Education, Health and Care Plans. Children from low-income families and minority ethnic families are shown to be the main beneficiaries of Sure Start.
What can we learn from all of this?
Four lessons from the Sure Start programme
First, the investment in Sure Start eventually paid off, but it took significantly longer than we had expected. It is likely that when the third report on children’s social care is published, the overall savings will considerably outweigh the original investment. However, this is clearly a slow burn: the savings accrue over a number of years and are unlikely to show up as savings for any single public service or government department. Investment in young children continues to be a hard sell in straitened economic times.
Second, it makes sense to locate services in areas where those who will benefit most have easy access. However, this model clearly works well for cities and towns with concentrations of low-income families and can leave low-income families in rural areas without adequate support. Perhaps the ongoing development of parenting apps and other digital means of support will help to fill the gap for families in rural areas. As recommended in the recent Nesta report that I co-authored with Professor Sylva, placing additional support in primary schools in rural areas also helps to ensure access.
Third, gathering data and ongoing evaluations that allow for course correction are critical. Data is essential to keep track of who is using the services and, more importantly, who is not – facilitating assertive outreach. For example, the NESS evaluation showed poor results, early on, for the children of teen parents. This information triggered course corrections to ensure teen parents were specifically catered for thereafter.
The final lesson is somewhat more nuanced. We continue to look for objectively measured, “evidence-based” interventions to improve outcomes for families and children – while also involving local parents in decisions about what would be on offer (this was a requirement for Sure Start centres). My experience is that families struggling to meet essential needs rarely ask for the kind of services that we think they need. Sure Start centre managers had considerable freedom over what would be on offer, and there was adequate funding to provide both what parents asked for and what local managers judged would be helpful for their children. Indeed, the very best Centres managed to build trust by responding as far as possible to local demands, while also encouraging participation in activities that had been empirically shown to improve children’s life outcomes.
The key principle for open access services is an offer that parents like and which also helps improve their children’s readiness for school. Weaker Centres sometimes got the balance wrong, putting community development ahead of the core aim of improving outcomes for children. Given the level of freedom for a variety of approaches, the positive results shown by the IFS reports are even more impressive.
Striking the balance: what seems to be needed versus what parents want
Should we go back to Sure Start? Every Government wants to put its own stamp on initiatives and the actual name may not be helpful in this respect. What we have learned is that integrated family support services in low-income neighbourhoods with the right staff and the right level of funding can have a positive long-term impact on children. However it is done, we need to make such services part of the basic infrastructure of local public services.
The key factor that made Sure Start a success is that parents loved it. Open access services are not like school: parents are not penalised if they do not use them, so the services need to be attractive if parents are to participate. At the same time, parents enjoying a service does not guarantee better outcomes for their children. It is essential to find delivery models that are both attractive to the local parents who need the services and effective in improving the lives of their children. Alignment is essential between what the data suggest is needed, what interventions the evidence says are effective and what parents are willing to engage with. Considering these factors together will ensure a decent return on any future investment in early years services.
All articles posted on this blog give the views of the author(s). They do not represent the position of LSE Inequalities, nor of the London School of Economics and Political Science.