
CCM AI Tool-Kit: Examples
Case: The New Haven Police Department
The police chief of New Haven, Connecticut recently proposed that police reports be written using artificial intelligence. The artificial intelligence (AI) would take the form of a new report-writing system called Draft One, which is part of a suite of technology and equipment that the New Haven Police Department could purchase from Axon, a police technology company. If approved, the agreement would be a five-year, $7.6 million contract that would also include additional dashboard and body cameras, Tasers, and software to live-stream drone footage, among other resources.
The proposal was unanimously accepted by the Board of Alders Public Safety Committee at City Hall on November 19, 2024. New Haven Police Chief Karl Jacobson described Draft One as a pilot program aimed at decreasing the amount of time police officers spend writing reports. According to Jacobson, his officers currently spend about two to two and a half hours of an eight-hour shift writing reports, and the AI technology could reduce that time by 65 percent. He suggested that officers could use the saved time to patrol and maintain a visible presence in order to reduce thefts.

Jacobson explained before the Board of Alders (the municipality’s governing body) that the program would begin with ten to fifteen officers using the technology to write reports, which the AI would generate from audio uploaded from officers’ body cameras. The pilot would run three to six months, during which the department would compare the AI-generated reports with reports written by officers. In a post-hearing interview, Jacobson affirmed that every fact generated by the AI is true.
The Alders on the committee argued that adopting AI for this purpose would give their constituents what they want: a greater police presence and greater officer accountability. They plan to move forward with the program and return with an update within the next six months.
The benefits of AI in this case are clear: greater time efficiency within the police department, which could translate into reduced crime through a larger police presence. But using AI in this way also carries its share of potential hazards. The American Civil Liberties Union published a white paper explaining in depth why it believes police departments should avoid using this technology to generate police reports.
The ACLU's report notes that generative AI, such as ChatGPT, is not always correct and has been shown to fabricate false information. These systems are trained on information drawn from nearly the whole of the internet, which is helpful in terms of breadth but detrimental in that they also absorb the prejudices and biases the internet contains. Even in the absence of an explicit error, an AI could still "spin" something in a subtle way that slips past an officer's notice. The use of AI for this purpose is still novel, and it's important that the public understands the implications so that a municipality can make an informed decision about whether it will be good for its community.
Moreover, requiring police officers to explain in writing the reasoning behind their use of discretionary power serves to remind them of the legal limits of their authority. Those written justifications for actions such as stops, frisks, searches, and consent searches are also reviewed by supervisors, who use them to reinforce those limits or to point out gaps in an officer's knowledge. Shifting to AI-drafted reports would remove this layer of accountability within police departments.