According to police records submitted to the city council, the network "only proved useful in a single case." Investigating the tension
between these claims, the Post suggested we may never know how many suspects were misidentified or what steps police took to ensure
responsible use of the controversial live feeds.

In the US, New Orleans stands out for taking a step further than law enforcement in other
regions by using live feeds from facial recognition cameras to make immediate arrests, the Post noted, while policies elsewhere "explicitly bar" the practice.

Lagarde told the Post that police cannot "directly" search for suspects on the camera network or add suspects to the watchlist in real time.
Reese Harper, an NOPD spokesperson, told the Post that his department "does not own, rely on, manage, or condone the use by members of the
department of any artificial intelligence systems associated with the vast network of Project Nola crime cameras."

In a federally mandated 2023 audit, New Orleans police complained that complying with the ordinance took too long and "often" resulted in no matches.
That could mean the tech is flawed, or it could be a sign that the process was working as intended to prevent wrongful arrests.

The Post noted that in total, "at least eight Americans have been wrongfully arrested due to facial recognition," errors that can occur when both police and AI get a match wrong.
"This is the stuff of authoritarian surveillance states and has no place in American policing."Project Nola did not immediately respond to
Ars' request for comment.