British Social Attitudes, conducted by the National Centre for Social Research (NatCen), is the UK’s most established and authoritative survey of political and social attitudes. Leigh Marshall explains how the process of disseminating the survey findings has been modernized in recent years, how NatCen ensures the survey is interacting with the policymaking process, and how it’s possible to maximize its press and media coverage.

British Social Attitudes is the country’s most authoritative survey of political and social attitudes and the National Centre for Social Research’s (NatCen) flagship study. The press report on it, the public discuss its findings (on talk radio and I’m sure also in their homes) and politicians trust it. We can point to front pages and mentions during Prime Minister’s Questions (and First Minister’s Questions in Scotland). New legislation sometimes follows revelations of changes in attitudes uncovered by the survey.

But what makes this survey of public opinion more influential than others?

Image taken from the British Social Attitudes website and published with permission.


You can’t talk about the survey without talking about the method. BSA is a face-to-face random probability sample survey – the gold standard of survey methodology, and the approach behind the survey that got the 2015 General Election result right. Its authority also comes from its longevity. The vision of Roger Jowell, NatCen’s founder, it was the first social survey in Britain to set out to chart shifting attitudes on an annual basis – most attitudinal surveys 30 years ago were conducted on an occasional, ad hoc basis. The survey’s 34-year-long series is unique and invaluable to social scientists and policymakers alike.

A modern publication

Only a few years ago the British Social Attitudes survey was a hardback book; today everything we publish is freely available online. We produce infographics and an interactive data visualisation, and publication is announced first on social media. To ensure our findings are timely and easily digestible, we have shortened our annual report and sped up the process – once the data is collected we turn around the report in six months, much faster than the 9-10 months it took only a couple of years ago (although I’m always pushing to get this done even quicker!).

Interacting with policy

We start from a solid foundation for impact – we have an authoritative and accessible study, with unique data on the issues that matter in British society. But how do we ensure that the survey is interacting with the policymaking process?

Sometimes they just come to us. Government departments often use the survey as a vehicle to test public attitudes to a policy. In recent years, the Department for Work and Pensions asked about attitudes to the benefits cap, while the Department for Communities and Local Government has used the survey to look at support for housebuilding. We also offer private briefings to government departments – whether they have funded questions or not. In recent years we have briefed DWP and DCLG on our findings, but also the Department for Business, Energy and Industrial Strategy and the Department for Exiting the European Union.

We like to get the research in front of as many parliamentarians as we can. We have launched the annual report at an event in Parliament (if an MP is really interested we’ll go and talk to them individually) and also held a separate event in Parliament that focused on our findings on attitudes to the European Union. Because of the EU’s importance it was the focus of quite a lot of our impact work on the last survey and we also held events on the EU at the party conferences in September 2015, nine months prior to the referendum.

Measuring the impact of this activity is not easy. We know that politicians are listening because they cite our data – when it suits them at least. Although this doesn’t always come to light; we only found out that the Prime Minister and his Director of Communications, Craig Oliver, had discussed some of our survey results in the run-up to the EU referendum when it was referenced in Mr Oliver’s post-referendum book. If the result had gone the other way, the book might never have been written.

And we know, as mentioned above, that departments will use BSA to test the popularity of a policy. Furthermore, we often see how evidence about public attitudes coincides with policy change, such as on attitudes to same-sex relationships. In 2000 the proportion of people saying that same-sex relationships are wrong fell below 50% for the first time. We subsequently saw the Labour government pass legislation that lowered the age of consent from 18 to 16 (2000), provided equal adoption rights (2002) and introduced civil partnerships (2004). By 2014 those opposed to same-sex relationships dropped to under 30%. The same year, the coalition government legalised same-sex marriage. It’s not easy to disentangle whether it is policy driving public opinion or vice versa (and there will have been other data sources – New Labour was known for its use of focus groups), but we do know that politicians don’t often like to go against the tide of popular opinion.

Figure 1: the y axis shows the percentage of survey respondents that felt same-sex relationships were “always wrong”

BSA in the press

We also put a lot of effort into making sure the research makes a splash in the press. This is partly because we see it as our role to reflect our findings back to the public who participate in our surveys. It is also a great way of generating debate and reaching policymakers – anyone who has worked closely with government ministers (or, like me, spent some time as a lobbyist) will tell you that they pay close attention to what the papers say.

British Social Attitudes is a PR dream. Stats. Authority. Human interest. But getting in the media is still extremely competitive and if, like us, you are aiming for lots of column inches at the front of the paper and the comment pages (a nib on page 27 is not good enough) then you need to think carefully about how to do this:

Finding the angle – it’s not enough just to put your numbers out there, you need to be able to point out the wider societal or political implications. We involve friendly journalists early on – their perspective on what is newsworthy and what is not is helpful in shaping the story (and it increases the likelihood of them covering it when the time comes). These conversations can also lead to creating more exciting content – we’ve worked with the BBC on a quiz and the MailOnline on a data visualisation.

Tailoring content – we tailor our content to different outlets. We don’t just prepare a press release. We are ready with tables for data journalists, regional data for local press, we undertake extra analysis for key journalists (mini-exclusives), produce charts and graphics for online publications. And we give broadcasters plenty of notice and make sure our spokespeople are free to get up to Salford if they need to.

Spreading the risk – a busy news agenda can scupper months of hard work and planning. So we have moved away from a once-a-year model and now publish short reports and interesting statistics throughout the year. We still like that big annual NatCen moment, though, so to ensure a good airing we put out a teaser the week before and last year we also released some additional findings on the day itself to keep the story going.

As a comms professional it’s a real privilege to work on a survey like British Social Attitudes, and I know that even without the efforts of me and my team the survey would be covered by the press and listened to by policymakers. But it’s no good if the people you want to reach have to go to the library to read your research, and a thoughtful media launch can double your coverage. British Social Attitudes is as authoritative as ever, but today it’s a modern publication valued by policymakers, available to the public and uncovering our society’s thorniest and most important issues.

The next British Social Attitudes report will be published by NatCen in June 2017, and all reports can be read online. Organisations can fund questions on the 2017 survey, or on the associated web panel, by contacting NatCen.

Note: This article gives the views of the author, and not the position of the LSE Impact Blog, nor of the London School of Economics. Please review our comments policy if you have any concerns on posting a comment below.

About the author

Leigh Marshall is Head of Communications at the National Centre for Social Research (NatCen). His team is responsible for promoting NatCen’s research in the media and ensuring that it is used by policymakers. Leigh joined NatCen at the start of 2013 from a leading public relations consultancy. He holds a Masters in Communication from the University of Illinois.
