On 1st January 2020, police in the Indian state of Himachal Pradesh installed over 19,000 CCTV cameras to form a “CCTV Surveillance Matrix”; it forms the basis of a predictive policing strategy (Outlook India, 2020). The police aim to install 68,000 CCTV cameras in this matrix, one for every 100 people. Himachal Pradesh is not the first Indian state to implement predictive policing, as police in Delhi, Telangana, and Jharkhand already have full-fledged predictive policing systems (Singh, 2020; Kodali, 2017; Kumar, 2012). Predictive policing systems are meant to help the police be proactive rather than reactive in their approach. While predictive policing has its supporters, I argue that these systems are ineffective, discriminatory, and violate privacy rights.
Understanding predictive policing
Predictive policing is defined as the use of algorithms to analyze massive amounts of information to predict and help prevent potential future crimes (Lau, 2020). It involves feeding this data to advanced algorithms in order to identify recurring patterns in criminal behavior. The rationale is that if the pattern exists, crime is bound to follow. For example, Delhi’s CMAPS (Crime Mapping Analytics and Predictive System) collects data every 3 minutes from the Indian Space Research Organization’s satellites, historical crime data, and the ‘Dial 100’ helpline to identify ‘crime hotspots’, i.e. crime-prone areas. Hyderabad police go a step further, using sensitive data from the ‘Integrated People Information Hub’ – which contains family details, biometric details, passport details, addresses, and even bank transaction records – to determine who is more likely to commit crimes (Umanadh, 2019).
Predictive policing methods can be divided into four categories: those which predict crimes; those which predict offenders; those which predict perpetrators’ identities; and those which predict victims (Perry et al., 2013). Many such methods utilize Artificial Intelligence (AI).
Does it even work?
There is convincing evidence that predictive policing algorithms are ineffective. The Chicago police used predictive policing algorithms to identify 426 people who were most at risk of being homicide victims in 2013-14. It was later found that only 3 of the 405 homicide victims in that period were on the list (Saunders, 2016). After implementing a predictive policing system, Pasco County in Florida observed no relative decrease in property crimes but saw a relative increase in violent crimes when compared with nearby counties (McGrory and Bedi, 2020). Predictive policing systems made by PredPol, one of the largest companies in the market, were discontinued by the Palo Alto and Rio Rancho police departments in the United States because they were ineffective (Puente, 2019). A trial of a predictive policing system in Berlin likewise produced poor results (Well, 2020).
The reason why predictive policing algorithms fail is that relying on data alone leads to a misunderstanding of causal relationships. For example, consider an American police officer’s comment that marijuana shouldn’t be legalized because 54% of violent offences are committed by those under the influence of the drug (Minnesota Public Radio News, 2013). The comment implies that marijuana causes violent offences, although the reverse may be true – violent offenders may simply be more likely to use marijuana. It is also possible that those who committed violent crimes under the influence of the drug were more likely to be caught. Thus, the statistic establishes only a correlation between marijuana use and violent crime, not causation. But predictive policing algorithms jump this gap and equate correlation with causation. If the aforementioned statistic were fed to an algorithm, it would conclude that marijuana use is a risk factor for violent crime – a claim that has been disproved over and over again (Green et al., 2010; Office of National Drug Control Policy, 2013).
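The second possibility above – that offenders under the influence are simply more likely to be caught – can be made concrete with a toy simulation. All rates below are assumptions chosen purely for illustration; by construction, drug use has no effect on offending here, only on the probability of arrest:

```python
import random

random.seed(0)

# Assumed rates, for illustration only: drug use does NOT cause offending here.
N_OFFENDERS = 100_000
P_DRUG_USE = 0.30        # true share of offenders who use the drug
P_CAUGHT_USER = 0.60     # users are more likely to be caught...
P_CAUGHT_NONUSER = 0.25  # ...than non-users

arrested_users = arrested_nonusers = 0
for _ in range(N_OFFENDERS):
    user = random.random() < P_DRUG_USE
    caught = random.random() < (P_CAUGHT_USER if user else P_CAUGHT_NONUSER)
    if caught:
        if user:
            arrested_users += 1
        else:
            arrested_nonusers += 1

# Share of *arrestees* who are users: roughly half, far above the true 30%,
# even though drug use played no causal role in offending.
share = arrested_users / (arrested_users + arrested_nonusers)
print(f"Drug users among arrestees: {share:.0%}")
```

An algorithm trained on the arrest records alone would see drug use heavily over-represented among recorded offenders and flag it as a risk factor, even though, in this sketch, it plays no causal role at all.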
Institutionalizing discrimination
Predictive policing algorithms legitimize discrimination by hiding it behind the façade of mathematical analysis. Historical criminal data, upon which predictive policing algorithms rely, isn’t necessarily reflective of who is more likely to commit a crime; rather, it is an indicator of who is more policed. The Indian police are notorious for being casteist and communal (Darapuri, 2020), although this is probably only a reflection of the attitudes of Indian society. Lower castes and religious minorities have historically faced systematic discrimination and violence despite constitutional protections (Kishore, 2016; Center for Study of Society and Secularism & Minority Rights Group International, 2017). The ruling BJP’s brazenly Hindu nationalist and upper-caste agenda is reversing decades of (albeit minimal) social progress (Gettleman et al., 2019).
Members of minority religions and lower castes are more likely to be in the crosshairs of law enforcement agencies, even when there is definitive proof of innocence. For example, in Ankush Maruti Shinde v. State of Maharashtra ((2019) 15 SCC 470), six men belonging to the marginalized Paradhi community spent sixteen years on death row in solitary confinement until the Supreme Court of India acquitted them. The police presumed them to be guilty simply because they belonged to the Paradhi community, despite the fact that an eyewitness had identified four other men from the rogues’ gallery as the actual offenders. Muslims, Dalits, and Adivasis, who are amongst the most vulnerable sections of Indian society, constitute over half of the undertrial prison population despite making up only 39% of the general population – a disparity that demonstrates the discrimination present in the Indian criminal justice system (Thakur and Nagarajan, 2020).
Similar patterns of discrimination exist in predictive policing systems. Mathematicians in the United States have urged their colleagues to stop working on predictive policing systems because they believe these systems perpetuate structural racism (Linder, 2020). In the United States, racial minorities, in particular African-Americans, are more policed than white people due to structural racism in the policing system (Willingham, 2019). This leads to racial bias in the crime data used by predictive policing algorithms, creating a discriminatory feedback loop – the more a certain group is policed, the more likely the algorithm believes that a member of that group is a potential criminal, which in turn leads to more policing of that group.
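The feedback loop can be sketched in a deliberately simplified simulation: two areas with identical true crime rates, an arrest record with a small initial bias, and a hypothetical allocator that sends patrols wherever recorded crime is highest. All numbers are assumptions chosen only to illustrate the mechanism:

```python
# Both areas have the SAME true crime rate; the only difference is a small
# bias in the historical record.
TRUE_CRIME_RATE = 0.05   # identical in areas A and B
PATROLS_PER_DAY = 10
recorded = {"A": 2.0, "B": 1.0}  # small initial bias toward area A

for day in range(100):
    # Greedy 'hotspot' allocation: patrol the area with more recorded crime.
    hotspot = max(recorded, key=recorded.get)
    # Crime is only recorded where police look (expected detections per day).
    recorded[hotspot] += PATROLS_PER_DAY * TRUE_CRIME_RATE

# Area A's record grows every day while B's never moves, despite equal rates.
print(recorded)
```

Because the algorithm only ever observes what the patrols record, the tiny initial bias is amplified into a permanent one: area A accumulates arrests indefinitely while area B’s record never changes, even though both areas are equally crime-prone.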
Marda and Narayan (2020) demonstrate that a similar phenomenon is happening with Delhi’s CMAPS. They explain that the algorithm only reinforces the biases of police officers. As a result, areas where caste and religious minorities constitute the majority are disproportionately more likely to be targeted by the police, creating a discriminatory feedback loop in the algorithm. This is in line with the results of a study commissioned by the UK government, which indicates that police officers deployed in ‘crime hotspots’ identified by data analytics are more likely to arrest people out of bias rather than probable cause (Babuta and Oswald, 2019).
Violating the Right to Privacy
As predictive policing algorithms process vast amounts of personal data, I must address their effect on the right to privacy. The opacity surrounding the use of personal data violates the right to privacy as envisioned in Justice K.S. Puttaswamy (Retd.) v. Union of India ((2017) 10 SCC 1). The landmark judgement also propounded the ‘proportionality and legitimacy test’, which established four criteria that must be fulfilled before the state can infringe one’s right to privacy:
- The action must be sanctioned by law
- The proposed action must be necessary in a democratic society for a legitimate aim
- The extent of such interference must be proportionate to the need for such interference
- There must be procedural guarantees against abuse of such interference
Criterion #1 is not fulfilled, since there is no legislation that governs the use of predictive policing algorithms. Criterion #4 is not fulfilled either. I contend that criterion #2 is also not fulfilled: there is ample evidence that predictive policing algorithms are ineffective and discriminatory, and they are therefore not necessary in a democratic society for the legitimate aim of lowering crime. There are plenty of ways to reduce crime that are effective and sustainable (Larsson, 2015).
Furthermore, as law enforcement authorities enjoy exemptions under the Right To Information Act, 2005, the actual workings of predictive policing algorithms cannot be ascertained. Due to such opacity, many like myself suspect that predictive policing is state surveillance hiding behind the smokescreen of internal security. These concerns are well-founded given that India’s National Intelligence Grid (NATGRID), a central intelligence database, is expected to receive access to citizens’ personal data such as bank account details (The Hindu, 2020). I am also concerned about the security of the databases containing personal data, given last year’s hack of Maharashtra’s Criminal Investigation Department website (Mengle, 2020).
Conclusion
At a time when structural discrimination and lack of police accountability are being widely discussed due to the Black Lives Matter movement, I feel that India’s authoritarian regime is entrenching both by implementing predictive policing algorithms. The lure of preventing crime is too good for law enforcement agencies to resist, despite ample evidence that it is ineffective. Predictive policing algorithms are only as good as the data they utilize, and if that data is biased, they institutionalize discrimination against minorities. Predictive policing algorithms may also be used by the government to keep tabs on its citizens. In light of all these issues, I believe that citizens must lobby for the scrapping of predictive policing systems. The constitutional validity of predictive policing algorithms can be challenged by relying on Madhu v. Northern Railways (247 (2018) DLT 198) and the landmark American case Griggs v. Duke Power (401 U.S. 424 (1971)), both of which hold that policies that are neutral on paper but discriminatory in practice (like predictive policing) violate the right against discrimination.
Given the rise of AI, the government should enact an algorithmic accountability law, something akin to a better-implemented version of New York’s Local Law 49 (Lechner, 2019).
While discontinuing predictive policing systems will reduce the discriminatory treatment meted out to minorities, it does nothing about the biases already prevalent in the Indian criminal justice system and society at large. While purely legal solutions cannot eliminate discrimination, the least the government can do is to legislate a comprehensive anti-discrimination law to strengthen the constitutional right against discrimination.
References
Babuta, A. and Oswald, M., 2019. Data Analytics and Algorithmic Bias in Policing. [online] London: Royal United Services Institute. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/831750/RUSI_Report_-_Algorithms_and_Bias_in_Policing.pdf [Accessed 6 March 2021].
Center for Study of Society and Secularism & Minority Rights Group International, 2017. A Narrowing Space: Violence and discrimination against India’s religious minorities. [online] Minority Rights Group International. Available at: https://minorityrights.org/wp-content/uploads/2017/06/MRG_Rep_India_Jun17-2.pdf.
Darapuri, S., 2020. The Police in India Is Both Casteist and Communal. The Wire, [online] Available at: https://thewire.in/caste/police-casteist-communal [Accessed 5 March 2021].
Gettleman, J., Schultz, K., Raj, S. and Kumar, H., 2019. Under Modi, a Hindu Nationalist Surge Has Further Divided India. The New York Times, [online] Available at: https://www.nytimes.com/2019/04/11/world/asia/modi-india-elections.html [Accessed 5 March 2021].
Green, K., Doherty, E., Stuart, E. and Ensminger, M., 2010. Does heavy adolescent marijuana use lead to criminal involvement in adulthood? Evidence from a multiwave longitudinal study of urban African Americans. Drug and Alcohol Dependence, [online] 112(1-2), pp.117-125. Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2950879/.
Kishore, R., 2016. The many shades of caste inequality in India. mint, [online] Available at: https://www.livemint.com/Politics/ino3tfMYVsd6VVGUdWXB8H/The-many-shades-of-caste-inequality-in-India.html [Accessed 5 March 2021].
Kodali, S., 2017. Hyderabad’s ‘Smart Policing’ Project Is Simply Mass Surveillance in Disguise. The Wire, [online] Available at: https://thewire.in/government/hyderabad-smart-policing-surveillance [Accessed 5 March 2021].
Kumar, R., 2012. Enter, the future of policing – Cops to team up with IIM analysts to predict & prevent incidents. The Telegraph India, [online] Available at: https://www.telegraphindia.com/jharkhand/enter-the-future-of-policing-cops-to-team-up-with-iim-analysts-to-predict-prevent-incidents/cid/390471 [Accessed 3 March 2021].
Larsson, N., 2015. 24 ways to reduce crime in the world’s most violent cities. The Guardian, [online] Available at: https://www.theguardian.com/global-development-professionals-network/2015/jun/30/24-ways-to-reduce-in-the-worlds-most-violent-cities [Accessed 6 March 2021].
Lau, T., 2020. Predictive Policing Explained. [online] Brennan Center for Justice. Available at: https://www.brennancenter.org/our-work/research-reports/predictive-policing-explained [Accessed 5 March 2021].
Lechner, C., 2019. New York City’s algorithm task force is fracturing. The Verge, [online] Available at: https://www.theverge.com/2019/4/15/18309437/new-york-city-accountability-task-force-law-algorithm-transparency-automation [Accessed 6 March 2021].
Linder, C., 2020. Why Hundreds of Mathematicians Are Boycotting Predictive Policing. Popular Mechanics, [online] Available at: https://www.popularmechanics.com/science/math/a32957375/mathematicians-boycott-predictive-policing/ [Accessed 5 March 2021].
Marda, V. and Narayan, S., 2020. Data in New Delhi’s Predictive Policing System. In: FAT* 20′: Conference on Fairness, Accountability, and Transparency. [online] New York: Association for Computing Machinery. Available at: https://www.vidushimarda.com/storage/app/media/uploaded-files/fat2020-final586.pdf [Accessed 6 March 2021].
McGrory, K. and Bedi, N., 2020. Targeted. Tampa Bay Times, [online] Available at: https://projects.tampabay.com/projects/2020/investigations/police-pasco-sheriff-targeted/intelligence-led-policing/ [Accessed 5 March 2021].
Mengle, G., 2020. Maharashtra CID website hacked, defaced. The Hindu, [online] Available at: https://www.thehindu.com/news/cities/mumbai/maharashtra-cid-website-hacked-defaced/article31005341.ece [Accessed 6 March 2021].
Minnesota Public Radio News, 2013. Stanek says marijuana shows clear link to violent behavior. [online] Available at: https://www.mprnews.org/story/2013/09/19/daily-circuit-rich-stanek-marijuana [Accessed 5 March 2021].
Office of National Drug Control Policy, 2013. Improving the Measure of Drug-Related Crime. [online] Washington, D.C.: Executive Office of the President of the United States. Available at: https://obamawhitehouse.archives.gov/sites/default/files/ondcp/policy-and-research/drug_crime_report_final.pdf.
Outlook India, 2020. 19,000 CCTV Cameras On Real-Time Streaming At HP Police HQs. [online] Available at: https://www.outlookindia.com/website/story/india-news-19000-cctv-cameras-on-real-time-streaming-at-hp-police-hqs/360917 [Accessed 5 March 2021].
Perry, W., McInnis, B., Price, C., Smith, S. and Hollywood, J., 2013. Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations. [online] RAND Corporation, p.14. Available at: https://www.rand.org/content/dam/rand/pubs/research_reports/RR200/RR233/RAND_RR233.pdf.
Puente, M., 2019. LAPD pioneered predicting crime with data. Many police don’t think it works. LA Times, [online] Available at: https://www.latimes.com/local/lanow/la-me-lapd-precision-policing-data-20190703-story.html [Accessed 5 March 2021].
Saunders, J., 2016. Pitfalls of Predictive Policing. [Blog] The RAND Blog, Available at: https://www.rand.org/blog/2016/10/pitfalls-of-predictive-policing.html [Accessed 5 March 2021].
Singh, K., 2020. Preventing crime before it happens: How data is helping Delhi Police. Hindustan Times, [online] Available at: https://www.hindustantimes.com/delhi/delhi-police-is-using-precrime-data-analysis-to-send-its-men-to-likely-trouble-spots/story-hZcCRyWMVoNSsRhnBNgOHI.html [Accessed 5 March 2021].
Thakur, A. and Nagarajan, R., 2020. Why Minorities Have a Major Presence in Prisons. [online] The Times of India. Available at: https://timesofindia.indiatimes.com/india/in-a-minority-but-a-major-presence-in-our-prisons/articleshow/73266299.cms.
The Hindu, 2020. I-T Dept. to share PAN, bank account data with 10 probe, intel agencies under NATGRID. [online] Available at: https://www.thehindu.com/news/national/i-t-dept-to-share-pan-bank-account-data-with-10-probe-intel-agencies-under-natgrid/article32183200.ece [Accessed 6 March 2021].
Umanadh, J., 2019. Telangana govt denies surveillance snooping on citizens. Deccan Herald, [online] Available at: https://www.deccanherald.com/national/south/telangana-govt-denies-surveillance-snooping-on-citizens-774306.html [Accessed 5 March 2021].
Well, L., 2020. Germany. Automating Society Report 2020. [online] AlgorithmWatch. Available at: https://automatingsociety.algorithmwatch.org/report2020/germany/.
Willingham, A., 2019. Researchers studied nearly 100 million traffic stops and found black motorists are more likely to be pulled over. CNN, [online] Available at: https://edition.cnn.com/2019/03/21/us/police-stops-race-stanford-study-trnd/index.html.