The LSE’s Professor Robin Mansell argues that more research is needed to document the necessity of the provisions in the Draft Communications Data Bill and to establish whether they represent a proportionate response.
Launched on 14 June, the Draft Bill is being characterised in the press as a ‘snooper’s charter’. The Bill deserves, and will receive, detailed pre-legislative scrutiny. There is no doubt that data of the kind the Government is seeking access to can be useful in police inquiries. But the Government’s underlying assumption is flawed: the fact that certain kinds of communications data have become less available and harder to access because of rapidly changing communication technologies does not make it necessary to open the door to a potentially massive increase in the data that are collected and processed.
The Government starts from the position that, without the provisions in the Bill, there is a risk that crimes will go undetected. If there is such a risk, its magnitude needs to be documented. It also needs to be set against the countervailing risk that those seeking to evade the authorities will simply work harder to do so; the Bill gives them new incentives to step up their efforts to outwit the police.
Much is made in the Bill of safeguards and protections. A crucial question that needs to be examined is whether the Government is right to be so confident that, as the scale and scope of data collection and processing through automated filtering increase, the room for error will remain at an acceptable level. Where is the evidence that the new types of data collection and retention are a proportionate response to threats to citizens’ safety?
The Government makes much of the growing complexity of technology. It needs to give the same consideration to the consequences of a step change in the complexity of the human processes set out in the Bill, and to the likelihood that mistakes will increase, as will the problems for those whose lives are affected. The Draft Bill is intended to be future-proof in the face of further technical change, but on first inspection it raises a host of questions about whether the proposed Request Filters and the collation of fragmented data will work in the way that is described. It appears that those who develop automated data collection systems have given the Government great confidence that such systems will not be prone to error. The whole history of software-based systems development in this and related areas suggests that this confidence is misplaced. These problems need to be investigated by independent experts.
If it turns out that the Government is stoking citizens’ fears for their safety in order to persuade them to accept these kinds of measures, then this is not responsible governance. The rhetoric of a ‘war’ on users of social media, online games, and mobile texting is reminiscent of the rhetoric of the ‘war on terror’, and there can be little doubt that this approach is far from consistent with the principles of transparent democratic decision-making.