Closing Argument

More Police Are Using Your Cameras for Video Evidence

Police “nerve centers” are blurring the line between public and private surveillance.

Detective Marco Christlieb works at a station in the Real-Time Crime Center at St. Louis Metropolitan Police Headquarters on Dec. 21, 2023, in St. Louis, Missouri.

This is The Marshall Project’s Closing Argument newsletter, a weekly deep dive into a key criminal justice issue. Want this delivered to your inbox? Subscribe to future newsletters here.

Los Angeles and Washington, D.C., are among the major cities slated to launch Real-Time Crime Centers in the coming months. The facilities are billed as “nerve centers” for the integration of police technology and data.

These centers vary, but they tend to integrate public surveillance video with other police technology like license plate readers, facial recognition, drone cameras, body camera footage and gunshot detection software. As Wired magazine reported last summer, the centers have been popping up across the country, with at least 135 now running, according to one count.

Proponents say the centers make it easier for police to solve crimes and find suspects. Opponents worry both that the centers invade privacy and that increased surveillance will disproportionately target Black people and other marginalized communities.

Increasingly, these facilities blur the line between private and public surveillance sources. According to data kept by the Electronic Frontier Foundation, a digital rights nonprofit, private cameras providing footage to law enforcement in Atlanta and Albuquerque dramatically outnumber public ones.

Private security footage is nothing new to criminal investigations, but two factors are rapidly changing the landscape: huge growth in the number of devices with cameras, and the fact that footage usually lands in a cloud server, rather than on a tape.

When a third party stores the footage in the cloud, police can seek the images directly from the storage company, rather than from the resident or business owner who controls the recording device. In 2022, the Ring security company, owned by Amazon, admitted that it had provided audio and video from customer doorbells to police without user consent at least 11 times. The company cited “exigent circumstances.”

In another case, police served a search warrant on Ring, rather than on Michael Larkin, an Ohio homeowner whose camera footage officers wanted. The company informed him that it was obligated to send footage from more than 20 cameras, “whether or not Larkin was willing to share it himself,” Politico reported.

In thousands of cities and towns, camera owners can opt into programs that give police access to their camera footage — sometimes live-streamed, sometimes after a specific request by police.

That footage can, in turn, help pull other novel kinds of surveillance into the mix. In San Francisco, investigators trying to solve a hit-and-run were reviewing doorbell camera footage when they noticed a Waymo self-driving vehicle — which records video — nearby around the time of the incident. The case was one of 10 uncovered by Bloomberg News in which police issued search warrants to the operators of self-driving taxi services, an avenue that will become increasingly possible as the cars become more widely available.

Cars aren’t the only autonomous machines that could be recruited for surveillance. The tech outlet 404 Media found that in Los Angeles, robot food delivery company Serve Robotics has provided footage to the LAPD as evidence in at least one criminal case. The robot itself was the target of the crime — an attempted “bot-napping” — but the company’s policies are vague, 404 reported, and could allow for footage to be shared in cases where the bots just happen to capture something of interest.

While some private cameras may stumble upon something relevant to the police, others go looking for it. This week, the city of St. Louis issued a cease-and-desist letter over an entrepreneur's plan to operate a private drone security program pitched as a crime deterrent.

Jamiles Lartey is a New Orleans-based staff writer for The Marshall Project. Previously, he worked as a reporter for the Guardian covering issues of criminal justice, race and policing. Jamiles was a member of the team behind the award-winning online database “The Counted,” tracking police violence in 2015 and 2016. In 2016, he was named “Michael J. Feeney Emerging Journalist of the Year” by the National Association of Black Journalists.