Policing Technologies: Expanding the Discourse Around Defunding the Police
On October 30, 2023, President Biden signed an Executive Order on the use of AI technologies by the federal government. In it, the administration stated that its goal was “to promote the equitable treatment of individuals and adhere to the Federal Government’s fundamental obligation to ensure fair and impartial justice for all, with respect to the use of AI in the criminal justice system” (White House, 2023). The question is: can AI-driven policing technologies be “fair and impartial”?
Artificial intelligence (AI) technologies are a suite of technical products that process information and then make predictions about how users should proceed, and they are supercharging economies across the globe. The leading global management consulting firm McKinsey estimates that the integration of AI systems into just 19 business verticals will add $3.5 to $5.8 trillion annually to economies in the Global North (Chui et al., 2018). This makes the integration of AI systems into society one of the most significant developments of our time (Elias, 2023).
AI-driven technologies are described as possessing artificial intelligence because they perform cognitive tasks normally associated with human beings. Some of the most prominent use cases for AI are within the security services. In July 2021 the Government Accountability Office (GAO) reported that 20 of the 42 federal agencies it surveyed that employ law enforcement officers had used facial recognition technology in some form (GAO, 2021). This is transforming our police forces from people engaged in the monitoring of crime into a technologically enabled system in which officers integrate advanced technological products into their day-to-day activities. This paper focuses on the use of facial recognition technologies (FRTs) by police and the impact this has on discourse around defunding the police.
Policing technologies, like all other AI systems, are taught to perform their tasks through a process called machine learning. Machine learning is an engineering protocol in which AI systems are programmed to identify patterns within large amounts of data (called training data), then use these patterns to develop statistical models that provide algorithms (software programs) with instructions on how to perform particular tasks (O’Neil, 2016). It is often assumed that the determinations made by AI systems are objective because they rest on algorithmic decision making driven by mathematical reasoning. However, data scientist Cathy O’Neil points out that algorithmic decision making is subjective: algorithms are opinions written in code (O’Neil, 2016). This observation is innocuous if left in the abstract, but subjective decision making, when applied to whether to arrest someone, can ruin a person’s life. This can be seen in how facial recognition technologies work. In most cases, police forces use systems that compare images of people’s faces captured by closed-circuit television (CCTV) footage to digital images of people in mugshot databases (Crocker, 2020). An FRT system then predicts whether the two images are a “good fit” by applying a predetermined statistical formula, or formulas, that define whether the images fall within the match range. This matching determination, however, does not tell police officers that the people in the two images are the same person, or that the person captured by CCTV or the person in the mugshot database was at the scene of the crime (Garvie et al., 2016).
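To make the match determination concrete, the logic above can be sketched in a few lines of Python. This is a minimal illustration, not a description of any real product: the short embedding vectors, the similarity function, and the threshold value are all invented for this example, whereas commercial FRT systems use proprietary models with far higher-dimensional representations.

```python
import math

# Hypothetical illustration: a face "embedding" is a list of numbers that a
# facial recognition model extracts from an image. Real systems use hundreds
# of learned dimensions; these short vectors are made up for this sketch.
probe_from_cctv = [0.61, 0.12, 0.88, 0.34]   # face captured on CCTV
mugshot_on_file = [0.58, 0.15, 0.90, 0.31]   # face from a mugshot database

def cosine_similarity(a, b):
    """Score how similar two embeddings are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.95  # a policy choice made by people, not a mathematical truth

score = cosine_similarity(probe_from_cctv, mugshot_on_file)
is_match = score >= MATCH_THRESHOLD

# The system reports only that two images are statistically similar.
# It cannot establish that they show the same person, or that either
# person was at the scene of a crime.
print(f"similarity={score:.3f}, flagged as match: {is_match}")
```

The key point the sketch makes is that the “match” is nothing more than a score crossing a threshold someone chose; change `MATCH_THRESHOLD` and the same pair of faces flips between “match” and “no match” without any new evidence.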
In 2018, computer scientists Joy Buolamwini and Timnit Gebru investigated the efficacy of commercial facial recognition systems in their paper Gender Shades. Buolamwini and Gebru found that gender classification systems developed by Microsoft, IBM, and Face++ had error rates of up to 34.7 percent when shown pictures of women with dark skin, but no more than 0.8 percent when shown images of light-skinned men (Buolamwini and Gebru, 2018). This is a phenomenon called algorithmic bias, in which AI and other technical systems reproduce the biases found in wider society. Buolamwini and Gebru attributed the high error rates for dark-skinned women to the lack of diversity in the training datasets used by these companies (Buolamwini and Gebru, 2018). There are many reasons for the lack of diversity in facial recognition training datasets; this paper explores one, namely how the historical racialization of personhood shapes expectations about which types of people a functional FRT should recognize with high accuracy, and which types of people can be acceptably misidentified.
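The kind of disaggregated audit Buolamwini and Gebru performed can be illustrated with a short Python sketch: instead of reporting one overall accuracy number, error rates are computed separately for each demographic group. The records below are invented for illustration only; a real audit uses a labeled benchmark dataset such as the one assembled for Gender Shades.

```python
# Hypothetical audit records: (group, whether the model's prediction was correct).
# These six rows are made up to show the mechanics of a disaggregated audit.
records = [
    ("darker-skinned women", False), ("darker-skinned women", False),
    ("darker-skinned women", True),
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True),
]

def error_rate_by_group(rows):
    """Return {group: fraction of predictions that were wrong}."""
    totals, errors = {}, {}
    for group, correct in rows:
        totals[group] = totals.get(group, 0) + 1
        if not correct:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

rates = error_rate_by_group(records)

# Aggregate accuracy can look acceptable while one group bears almost all
# of the errors -- which is why disaggregated reporting matters.
for group, rate in rates.items():
    print(f"{group}: {rate:.0%} error rate")
```

In this toy data the overall accuracy is 67 percent, but every error falls on one group; that asymmetry, invisible in a single aggregate number, is exactly what the Gender Shades methodology was designed to surface.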
The use of technology to surveil the movements of Black people has historical roots. In the next section of this paper, I argue that policing has always been a technical project by drawing a line between New York City's 1713 Lantern Law and contemporary police use of facial recognition (Browne, 2015). Although slavery is often discussed in the Southern context, African people were also enslaved in the North. The first documented Black person to arrive in Manhattan was a servant called Jan Rodrigues, brought to the island by the Dutch merchant Thijs Mossell in 1613 (Hodges, 1999). Over time, the Africans who were brought to Manhattan were forced into the ranks of the enslaved; the 1698 census recorded that 41 percent of households in New York City had at least one enslaved person (Hodges, 1999). Free and enslaved Black people alike resisted their enslavement, resulting in intense public debate about how to control the enslaved population (Hodges, 1999).
One of the laws that addressed this was the 1713 Law for Regulating Negro and Indian Slaves in the Night Time, which banned Indian and Black slaves over the age of 14 from being outside at night without a lantern or lighted candle. This stipulation provided white people with the technology needed to continually surveil the movements of Black New Yorkers (Browne, 2015). Facial recognition technologies continue this work, replacing the light provided by lanterns with the ability to analyze CCTV footage and continually track the movements of contemporary Black people. The financial incentives facial recognition companies reap by assisting police forces in these racist practices are leading to the wrongful identification of Black people, particularly Black men, much as the 1713 Lantern Law incentivized the wrongful capture of Black people (Hodges, 2010). The use of technology to surveil Black people is therefore not new; facial recognition is simply the latest iteration of a historical practice.
The central argument of this paper is that police use of facial recognition technologies (FRTs) should change the way the abolitionist community thinks about policing: protest can no longer focus solely on indicting police officers who engage in fatal encounters with Black people, because policing tasks are now also carried out by AI. For example, in 2020 Robert Williams, an African American man from Detroit, was arrested after being wrongfully identified as a criminal suspect by a facial recognition system used by the Detroit Police Department (DPD). While Williams was in custody, police officers tried to pressure him into a confession by presenting him with CCTV footage showing “him” stealing jewelry from a nearby store. It was not until Williams held the CCTV image up next to his face that officers realized he had been wrongly identified (Hill, 2020). Police forces claim FRT determinations are not used to establish probable cause (US Department of Homeland Security, 2017), but in Williams's case officers used an FRT determination to pressure him to confess to a crime he did not commit. This suggests members of the Detroit Police Department were using evidence generated by a facial recognition system to establish probable cause. Police abolitionist Angela Davis argues that the denial of a person's freedom is a form of psychological harm (Davis, 1970). Williams was therefore harmed by the DPD, and this harm was precipitated by a facial recognition system. Moreover, the fact that officers used a positive match made by a facial recognition system to take Williams into custody suggests they denied him the presumption of innocence, a violation of his Fifth Amendment rights (Schneider, 2018).
The prospect that facial recognition technologies (FRTs) deny Black people their constitutional rights has not dampened sales. In fact, the FRT market is projected to grow from $5.1 billion in 2021 to $12.6 billion in 2028 (Research and Markets, 2022). This is because police forces are not the only groups using facial recognition to surveil the movements of people. Facial recognition technology has been integrated into doorbells (Kelley and Guariglia, 2020), used by retail outlets to monitor theft (Gershgorn, 2021), and built into electric cars (Waxler, 2021). This mass adoption provides a new set of tools for the surveillance of all people but is particularly dangerous for those who are racialized as Black, reinforcing the idea that Black people are a “problem” that needs to be solved (Du Bois, 1903). The fact that the facial recognition trade builds immense wealth for white people by selling these technologies to law enforcement agencies engaged in racist practices (German, 2020) makes it an example of a system run on racial capitalist logics.
The term racial capitalism builds on the definition of capitalism offered by Karl Marx in Capital, Volume One, in which Marx states that the owners of production exploit low-wage workers to build wealth (Marx, 1867). Neither group is assigned a race; Marx was white, and the assumption is that he was directing his comments to other white people. Racial capitalism, by contrast, takes note of the unique ways Black and other non-white people are exploited within capitalist systems because of their race. The immense investment needed to launch R&D-based tech companies means facial recognition companies are backed by venture capitalists, and Black startup founders currently receive less than 1 percent of venture funding in the United States (Fonrouge, 2023). Because of this funding gap, facial recognition companies tend to be run by non-Black people. Yet the people who help FRT companies generate profits are most likely to be Black, because police forces arrest Black people at higher rates than whites (Srikanth, 2020). When police forces use facial recognition technology to arrest Black people, those arrestees are more exploited than people from other groups because their faces improve FRT functionality by diversifying the system's dataset. This act of training facial recognition systems is a form of labor for which arrestees are not paid, making the facial recognition trade a stark example of racial capitalism.
The large-scale adoption of facial recognition technologies by police forces (GAO, 2021) suggests there is an appetite for the continued use of AI technology in the administration of public safety. Police abolitionists may therefore view this as an opportunity to update their call to defund the police by specifying the need to end AI-driven policing in its present form. Given that the call for abolition includes the demand to end harmful practices and replace them with life-affirming ones, this may be a time to advocate for using recognition technologies to improve public safety. For example, recognition systems could be used to find out why people steal food, squat on private property, or register their children in better-funded school districts than the ones in which they live. This information could then guide community investment strategies designed to address the underlying reasons for these crimes, alongside advocacy for the decriminalization of these activities (Vitale, 2021). Note that these use cases would remove the facial identification element in order to maintain the privacy of impacted groups.
In conclusion, the integration of AI technologies into policing has expanded the discourse around defunding the police. Yet the defund discourse is often limited to officers themselves, which results in analogue arguments being used against technically enabled policing practices. The use of AI tools to guide arrest decisions has created a gap in the analysis of what it means to be policed, a gap that could be filled by adding a technical analysis to police abolition work. Further, policing technologies are introduced to the public in press conferences that tout how these systems are moving policing into the 21st century. This is not true: technology has always been used to track and surveil Black people, whether in the form of the lantern in 1713 or facial recognition in 2023. The commercial policing tech sector is a capitalist project that holds in place, and then projects into the future, our histories of racist policing practices. Despite this, policing technologies are being rapidly integrated into everyday life. Twenty-first-century police abolitionists should therefore think about how to repurpose policing technologies within alternative public safety frameworks (Benjamin, 2019). AI-driven policing is not in our future; it is happening now, in every town and city in the United States. To push back, police abolitionists should integrate the insights of sociologist Ruha Benjamin, who writes about using AI technologies to advance the abolitionist project.
Bibliography
Don Babwin (August 24, 2021) Chicago watchdog harshly criticizes Shotspotter system, PBS, read the article here https://www.pbs.org/newshour/nation/chicago-watchdog-harshly-criticizes-shotspotter-system
Ruha Benjamin (2019) Race After technology: Abolitionist Tools for the New Jim Code,Page 161-162, Polity Press, London, England
Deborah Bloom and Jareen Imam (December 8, 2014) New York man dies after chokehold by police, CNN, read the article here https://www.cnn.com/2014/07/20/justice/ny-chokehold-death/index.html
Nicholas Bogel-Burroughs and Will Wright (April 19, 2021) Little has been said about the $20 bill that brought officers to the scene, New York Times, read it here https://www.nytimes.com/2021/04/19/us/george-floyd-bill-counterfeit.html
Joy Buolamwini and Timnit Gebru (2018) Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, Pages 8, 10, Proceedings of Machine Learning Research 81 (Conference on Fairness, Accountability and Transparency)
Simone Browne (2015) Dark Matters: On the Surveillance of Blackness, Page 77-78, Duke University Press, Durham and London
Danielle Brown (2018) Google diversity annual report 2018, Page 5, Google, read the report here https://static.googleusercontent.com/media/diversity.google/en//static/pdf/Goo gle Diversity annual report 2018.pdf
Mona Chalabi (October 27, 2016) Weapons of Math Destruction: Cathy O'Neil adds up the damage of algorithms, The Guardian, read the article here https://www.theguardian.com/books/2016/oct/27/cathy-oneil-weapons-of-math-destruction-algorithms-big-data
Michael Chui et al (2018) Notes from the AI frontier: Applications and value of deep learning, McKinsey Global Institute, read the report here https://www.mckinsey.com/featured-insights/artificial-intelligence/notes-from-the-ai-frontier-applications-and-value-of-deep-learning
Clinton Colmenares (June 24, 2020) Unfounded fear helps fuel police violence, Furman University, read the report here https://www.furman.edu/news/2020/06/24/unfounded-fear-helps-fuel-police-violence/
Courtney Connley (June 1, 2021) These are the 10 most in-demand A.I. jobs according to Indeed and they all pay at least $95,000, CNBC, read the article here https://www.cnbc.com/2021/06/01/10-of-the-most-in-demand-ai-jobs-that-pay-at-least-95000.html
Kade Crocker (June 16, 2020) How is Face Recognition Surveillance Technology Racist?, ACLU, Massachusetts Technology for Liberty Project, read the article here https://www.aclu.org/news/privacy-technology/how-is-face-recognition-surveillance-technology-racist
Jay Croft (June 21, 2017) Philando Castile shooting: Dashcam video shows rapid event, CNN, read the article here https://www.cnn.com/2017/06/20/us/philando-castile-shooting-dashcam/index.html
Angela Y.Davis (1970) First Lecture in Liberation published as a forward to: Narrative of the Life of Frederick Douglass: An American Slave written by himself: A New Critical Edition by Angela Y Davis featuring her Lectures on Liberation (published 2010), Page 50 Open Media Series / City Lights Books, San Francisco
Angela Y.Davis (1995) Violence Against Women and the Ongoing Challenge to Racism, Page 140, Kitchen Table/Women of Color, Latham, NY
Angela Y.Davis (2003) Are Prisons Obsolete?, Pages 28, Seven Stories Press, Washington DC
Frederick Douglass (1883) Address of Hon Fred Douglass, delivered before the National Convention of Colored Men, at Louisville, Ky, Library of Congress, Louisville, Ky, read the speech here https://omeka.coloredconventions.org/items/show/554
W.E.B. Du Bois (1901) The Spawn of Slavery: The Convict-Lease System in the South, pages 3-4, The Missionary Review of the World
W.E.B. Du Bois (1903; reprinted 2017) The Souls of Black Folk, pages 6-7 Penguin Books, New York
Jennifer Elias (October 3, 2019) Google contractor reportedly tricked homeless people into face scans, CNBC, read the article here https://www.cnbc.com/2019/10/03/google-contractor-reportedly-tricked-homeless-people-into-face-scans.html
Jennifer Elias (April 17, 2023) Google CEO Sundar Pichai warns society to brace for impact of A.I. acceleration, says 'it's not for a company to decide', CNBC, read the article here https://www.cnbc.com/2023/04/17/google-ceo-sundar-pichai-warns-society-to-brace-for-impact-of-ai-acceleration.html
Mary Elliot and Jazmine Hughes (August 19, 2019) No. 1 / Slavery, power and the human cost, 1455-1775, New York Times Magazine, read the article here https://www.nytimes.com/interactive/2019/08/19/magazine/history-slavery-smithsonian.html
Frantz Fanon (1961) The Wretched of the Earth, Page 4-5, Grove Press, New York, 60th Anniversary edition
Todd Feathers (July 19, 2021) Gunshot-Detecting Tech Is Summoning Armed Police to Black Neighborhoods, Vice, read the article here https://www.vice.com/en/article/88nd3z/gunshot-detecting-tech-is-summoning-armed-police-to-black-neighborhoods
Federal Reserve Bulletin (1923) Saint Louis, read it here https://fraser.stlouisfed.org/files/docs/publications/FRB/pages/1920-1924/26396_1920-1924.pdf
Gabrielle Fonrouge (February 2, 2023) Venture capital for Black entrepreneurs plummeted 45% in 2022, data shows, CNBC, read the article here https://www.cnbc.com/2023/02/02/venture-capital-black-founders-plummeted.html
Philip S. Forner (ed) (1955) The Life and Writings of Frederick Douglass.Volume 4 Reconstruction and After, page 347, International Publishers, New York
Michael German (August 27, 2020) Hidden in Plain Sight: Racism, White Supremacy, and Far-Right Militancy in Law Enforcement, Brennan Center, read the article here https://www.brennancenter.org/our-work/research-reports/hidden-plain-sight-racism-white-supremacy-and-far-right-militancy-law
Dave Gershgorn (July 12, 2021) Retail stores are packed with unchecked facial recognition, civil rights organizations say: Lowe's, Albertsons, Macy's are already using the tech, The Verge, read the story here https://www.theverge.com/2021/7/14/22576236/retail-stores-facial-recognition-civil-rights-organizations-ban
Ruth Wilson Gilmore (2017) Abolition Geography and the Problem of Innocence, Page 473, Tabula Rasa,
Phillip Atiba Goff (no. 2 2016) Identity traps: How to think about race & policing, pages 10-22, Behavioral Science & Policy 2, doi 10.1353/bsp.2016.0012.
Cassie Gooptar (March 28, 2023) How we uncovered the Guardian founders' links to slavery, The Guardian, read the article here https://www.theguardian.com/news/ng-interactive/2023/mar/28/the-cotton-thread-guardian-founders-slavery-john-edward-taylor
John Gramlich (May 21, 2019) From police to parole, black and white Americans differ widely in their views of criminal justice system, Pew Research Center, read the report here https://www.pewresearch.org/fact-tank/2019/05/21/from-police-to-parole-black-and-white-americans-differ-widely-in-their-views-of-criminal-justice-system/
Kara Grant (September 22, 2020) Shotspotter Sensors Send SDPD Officers to False Alarms More Often Than Advertised, Voice of San Diego, read the article here https://voiceofsandiego.org/2020/09/22/shotspotter-sensors-send-sdpd-officers-to-false-alarms-more-often-than-advertised/
Emma Griffin (May 15, 2014) Manchester in the 19th century, British Library, read the article here https://www.bl.uk/romantics-and-victorians/articles/manchester-in-the-19th-century
Samuel R. Gross (September 1, 2022) National Registry of Exonerations Wrongful Convictions in the United States 2022, University of Michigan, read the article here https://www.law.umich.edu/special/exoneration/Documents/Race%20Report%20Preview.pdf
JooHee Han (March, 2022) The tech industry talks about boosting diversity, but research shows little improvement, University of Massachusetts, Amherst, read the article here https://www.umass.edu/employmentequity/diversity-reports
Caroline Haskins and Donald Tomaskovic-Devey (April 6, 2021) The NYPD Has Misled The Public About Its Use Of Facial Recognition Tool Clearview AI, BuzzFeed News, read the article here https://www.buzzfeednews.com/article/carolinehaskins1/nypd-has-misled-public-about-clearview-ai-use
Shahram Heshmat Ph.D. (April 23, 2015) What is Confirmation Bias?, Psychology Today, read the article here https://www.psychologytoday.com/us/blog/science-choice/201504/what-is-confirmation-bias
Kashmir Hill (Published June 24, 2020, updated August 3, 2020) Wrongfully Accused by an Algorithm, New York Times, read the article here https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html
Kashmir Hill (December 1, 2020) Another Arrest, and Jail Time, Due to a Bad Facial Recognition Match, New York Times, read the article here https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html
Elizabeth Hinton (May 1, 2018) An Unjust Burden: The Disparate Treatment of Black Americans in the Criminal Justice System, Page 2, Vera Institute of Justice, read the article here https://www.vera.org/downloads/publications/for-the-record-unjust-burden-racial-disparities.pdf
Ryan Mac (March 31, 2023) 'Thousands of Dollars for Something I Didn't Do', New York Times, read the article here https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html
Graham Russell Hodges (1999) Root & Branch: African Americans in New York and East Jersey, 1613-1863, pages 6-7, 40-41, 63-65, University of North Carolina Press, Chapel Hill & London
Graham Russell Gao Hodges (2010) David Ruggles: A Radical Black Abolitionist and The Underground Railroad in New York City, pages 35 61, University of North Carolina Press, Chapel Hill
Stephanie E. Jones-Rogers (2019) They Were Her Property: White Women as Slave Owners in the American South, Pages xiv, 21, 140-141, Yale University Press, New Haven and London
Mariame Kaba (June 12, 2020) Yes, We Mean Literally Abolish the Police: Because Reform Won't Happen, New York Times, read the article here https://www.nytimes.com/2020/06/12/opinion/sunday/floyd-abolish-defund-police.html
K. Eugene Shutler, The Fugitive Slave Act of 1850, Constitutional Rights Foundation, Los Angeles, read it here https://www.crf-usa.org/images/pdf/Fugitive-Slave-Law-1850.pdf
Jason Kelley (June 10, 2020) Amazon Ring Must End Its Dangerous Partnerships With Police, Electronic Frontier Foundation, read the article here https://www.eff.org/deeplinks/2020/06/amazon-ring-must-end-its-dangerous-partnerships-police
Nicole Karlis and Matthew Guariglia (April 6, 2023) AI company harvested billions of Facebook photos for a facial recognition database it sold to police, Salon, read the article here https://www.salon.com/2023/04/06/ai-company-harvested-billions-of-facebook-photos-for-a-facial-recognition-database-it-sold-to-police/
Khari Johnson (March 7, 2022) How Wrongful Arrests Based on AI Derailed 3 Men's Lives: Wired, read the article here https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/
Library of Congress, Constitution of the United States, Article One, read it here https://www.loc.gov/collections/century-of-lawmaking/articles-and-essays/cent ury-presentations/constitution/?loclr=bloglaw
David K. Li (May 29, 2020) George Floyd told police he was struggling to breathe before an officer put a knee on his neck, NBC News, read the article here https://www.nbcnews.com/news/us-news/george-floyd-told-police-he-was-struggling-breathe-officer-put-n1218556
Kelly Main and Cade Metz (June 15, 2022) How a Brilliant Artificial Intelligence-Inspired Solution Can Cut Your Company's Costs by 20 Percent, Inc., read the article here https://www.inc.com/kelly-main/artificial-intelligence-efficiency-productivity.html
Karl Marx (1990; originally published 1867) Capital: Volume I, Page 449, Penguin Classics, New York
Robert McNamara (June 30, 2019) King Cotton and the Economy of the Old South, ThoughtCo, read the article here https://www.thoughtco.com/king-cotton-1773328
Lindsay-Rae McIntyre (November 14, 2018) Diversity and inclusion update: The Journey Continues, Microsoft, read the article here https://blogs.microsoft.com/blog/2018/11/14/diversity-and-inclusion-update-the-journey-continues/
Brian Mosley (December 1, 2022) NITRD: Releases Supplement to President’s FY23 Annual Budget, Giving the Community First Opportunity to see Details on IT and AI R&D Expenditures for FY22, Computing Research Association, read the article here https://cra.org/govaffairs/blog/2022/12/nitrd-releases-fy23-supplement/
Khalil Gibran Muhammad (2010) The Foundational Lawlessness of the Law Itself: Racial Criminalization & the Punitive Roots of Punishment in America, Page 3, Harvard University Press, Cambridge, Massachusetts; London, England
National Archives (Ratified 1788) The U.S. Constitution, Article 1. Section 2. The "Three-Fifths Clause, read them here https://www.thirteen.org/wnet/slavery/experience/legal/docs2.html
Mutale Nkonde (June 10, 2019) A.I. Is Not as Advanced as You Might Think: It starts with the systems it was built off of, Zora, read the article here https://zora.medium.com/a-i-is-not-as-advanced-as-you-might-think-97657e9eecdc
Mutale Nkonde (2019) Automated Anti-Blackness: Facial Recognition Use in Brooklyn, Pages 30-36, Harvard African American Policy Journal, Cambridge, Massachusetts
Safiya U Noble (2018) Algorithms of Oppression: How Search Engines Reinforce Racism, Page 70, New York University Press, New York City.
New York Times (April 14, 2023) A Partial List of U.S. Mass Shootings in 2023 , New York Times, read the article here https://www.nytimes.com/article/mass-shootings-2023.html
Cathy O'Neil (2016) Weapons of Math Destruction:How Big Data Increases Inequality and Threatens Democracy, Page 19, Broadway Books,New York
Faith Popcorn (December 5, 2016) Artificial Intelligence May Usher in a New Golden Age, New York Times, read the article here https://www.nytimes.com/roomfordebate/2016/12/05/is-artificial-intelligence-taking-over-our-lives/artificial-intelligence-may-usher-in-a-new-golden-age
Derecka Purnell (July 6, 2020) How I Became a Police Abolitionist, The Atlantic, read the article here https://www.theatlantic.com/ideas/archive/2020/07/how-i-became-police-abolitionist/613540/
Danielle Remmerswaal et al (epub July 13, 2013) Cognitive bias in action: evidence for a reciprocal relation between confirmation bias and fear in children, Journal of Experimental Psychology, read the article here https://pubmed.ncbi.nlm.nih.gov/23933089/
Research and Markets (May 2, 2022) Global Facial Recognition Market Forecast Report 2021-2028, read the article here https://www.yahoo.com/now/global-facial-recognition-market-forecast-095300708.html
Cedric Robinson (2019) Racial Capitalism, Black Intellectualism and Cultures of Resistance, Page 79, Pluto Press, London
Darcel Rockett (March 7, 2019) Poor people often can't afford to pay bail even when they're innocent. An app developed in Chicago offers help using your spare change, Chicago Tribune, read the article here https://www.chicagotribune.com/lifestyles/ct-life-appolition-making-bail-20190124-story.html
Chris Mills Rodrigo (July 29, 2021) Activists demand Chicago end ShotSpotter contract, The Hill, read the article here https://thehill.com/policy/technology/565537-activists-demand-chicago-end-shotspotter-contract/
Rich Morin and Renee Stepler (September 29, 2016) The Racial Confidence Gap in Police Performance, Pew Research Center, read the report here https://www.pewresearch.org/social-trends/2016/09/29/the-racial-confidence-gap-in-police-performance/
Lawrence Rosenthal (June 12, 2020) Police violence is mostly rooted in fear. Ignoring that makes reform harder., NBC News, read the article here https://www.nbcnews.com/think/opinion/police-violence-mostly-rooted-fear-ignoring-makes-reform-harder-ncna1230266
Steven Rosenbush (March 8, 2022) Big Tech Is Spending Billions on AI Research. Investors Should Keep an Eye Out, Wall Street Journal, read the article here https://www.wsj.com/articles/big-tech-is-spending-billions-on-ai-research-investors-should-keep-an-eye-out-11646740800
Geetika Rudra (August 12, 2013) About 90 Percent of New Yorkers Stopped and Frisked Were 'Innocent,' Says NYCLU, ABC News, read the article here https://abcnews.go.com/blogs/headlines/2013/08/about-90-percent-of-new-yorkers-stopped-and-frisked-were-innocent-says-nyclu
Matthew Sandler (2020) Fugitive Romance The Black Romantic Revolution: Abolitionist Poets at the End of Slavery, Pages 1-3, Verso, London
Fr. Matthew P. Schneider (July 19, 2018) Face Recognition and "Innocent Until Proven Guilty", Patheos, read the article here https://www.patheos.com/blogs/throughcatholiclenses/2018/07/will-we-still-be-innocent-until-proven-guilty/
Sigal Samuel (October 20, 2022) Joy Buolamwini saw first-hand the harm of AI bias. Now she's challenging tech to do better, Vox, read the article here https://www.vox.com/future-perfect/23365558/future-perfect-50-ai-joy-buolamwini-founder-algorithmic-justice-league
Anagha Srikanth (June 11, 2020) Black people 5 times more likely to be arrested than whites, according to new analysis, The Hill, read the article here https://thehill.com/changing-america/respect/equality/502277-black-people-5-times-more-likely-to-be-arrested-than-whites/
Elaisha Stokes (November 19, 2020) Wrongful arrest exposes racial bias in facial recognition technology, CBS News, read the article here https://www.cbsnews.com/news/detroit-facial-recognition-surveillance-camera-racial-bias-crime/
Bryan Stevenson, The Transatlantic Slave Trade, pages 13-23, The Equal Justice Initiative, read the article here https://eji.org/wp-content/uploads/2005/11/transatlantic-report-PDF-web.pdf
Scott Trust (March 28, 2023) The Scott Trust Legacies of Enslavement report, Guardian Newspaper, Manchester, England, read the article here https://www.theguardian.com/the-scott-trust/ng-interactive/2023/mar/28/the -scott-trust-legacies-of-enslavement-report
US Government Accountability Office (July 13, 2021) Facial Recognition Technology: Federal Law Enforcement Agencies Should Have Better Awareness of Systems Used By Employees, read the article here https://www.gao.gov/products/gao-21-105309
U.S. Department of Justice (2015) Changes Necessary to Remedy Ferguson's Unlawful Law Enforcement Practices and Repair Community Trust, pages 141-162, read the report here https://www.justice.gov/sites/default/files/opa/press-releases/attachments/2015/03/04/ferguson_police_department_report.pdf
Alex S. Vitale (2021, updated edition) The End of Policing: The Problem is not police training, police diversity, or police methods. The problem is policing itself, Pages xxii, 3, 31, 47, Verso, London
Erik Waxler (December 2, 2021) Professors working on self-driving car technology that uses facial recognition system, WFTS Tampa Bay, read the article here https://www.abcactionnews.com/news/region-polk/lakeland/professors-working-on-self-driving-car-technology-that-uses-facial-recognition-system
Maxine Williams (July 12, 2018) Facebook 2018 Diversity Report: Reflecting on our Journey, Facebook, read the article here https://about.fb.com/news/2018/07/diversity-report/
White House (October 30, 2023) Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, read the order here https://www.whitehouse.gov/briefing-room/presidential-actions/2023/10/30/executive-order-on-the-safe-secure-and-trustworthy-development-and-use-of-artificial-intelligence/