The next step in surveillance cameras: Finding your friends

A gray-haired man walks through the office lobby with a cup of coffee in hand, staring straight ahead as he passes the entryway.

He seems unaware that he is being tracked by a network of cameras that can detect not only where he is but also who he is with.

Surveillance technology has long been able to identify you. Now, with the help of artificial intelligence, it is trying to figure out who your friends are.

With a few clicks, this “co-appearance” or “correlation analysis” software can find anyone who has appeared in surveillance frames within a few minutes of the gray-haired man over the last month, weed out people who crossed his path only once or twice, and zero in on a man who has shown up 14 times. The software can then mark potential interactions between the two men, now considered likely associates, on a searchable calendar.
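
As an illustration of the general idea only (Vintra has not published how its system works), co-appearance analysis can be sketched as a simple counting problem: take a log of camera sightings, find everyone seen on the same camera as a target within some time window, and rank them by how often they turn up together. The short Python sketch below uses made-up field names and a hypothetical 10-minute window; it is not any vendor's implementation.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical sighting log: (person_id, camera_id, timestamp).
# The IDs, cameras and times are invented for illustration.
sightings = [
    ("target", "lobby_cam",  datetime(2023, 1, 9, 8, 55)),
    ("p_102",  "lobby_cam",  datetime(2023, 1, 9, 8, 58)),
    ("p_417",  "lobby_cam",  datetime(2023, 1, 9, 9, 4)),
    ("p_102",  "garage_cam", datetime(2023, 1, 10, 17, 30)),
    ("target", "garage_cam", datetime(2023, 1, 10, 17, 33)),
]

def co_appearances(sightings, target, window=timedelta(minutes=10)):
    """Count how often each person is seen on the same camera
    within `window` of one of the target's sightings."""
    target_hits = [(cam, ts) for pid, cam, ts in sightings if pid == target]
    counts = Counter()
    for pid, cam, ts in sightings:
        if pid == target:
            continue
        if any(cam == t_cam and abs(ts - t_ts) <= window
               for t_cam, t_ts in target_hits):
            counts[pid] += 1
    # Most frequent co-appearances first: the "likely associates"
    return counts.most_common()

print(co_appearances(sightings, "target"))
# [('p_102', 2), ('p_417', 1)]
```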

Vintra, the San Jose-based company that demonstrated the technology in an industry video presentation last year, sells the co-appearance feature as part of an array of video analysis tools. The company boasts on its website about ties to the San Francisco 49ers and a Florida police department. The Internal Revenue Service and additional police departments across the country are paying for Vintra’s services, according to a government contracting database.

Although co-appearance technology is already used in authoritarian regimes such as China, Vintra is the first company to sell it in the West, industry specialists said.

In the first frame, the presenter selects a “target.” In the second, the software finds people who appeared on camera within 10 minutes of that person. In the third, a camera captures a “companion” of the first person.

(IPVM)

But the company is one of many testing new AI and surveillance applications with little public scrutiny and few formal safeguards against privacy invasions. In January, for example, New York state officials criticized the company that owns Madison Square Garden for using facial recognition technology to bar employees of law firms suing the company from attending events at its arenas.

Industry experts and watchdogs say that even if the co-appearance tool is not in use now (one analyst expressed confidence that it is), it could become more reliable and more widely available as artificial intelligence capabilities advance.

None of the entities that do business with Vintra contacted by The Times acknowledged using the co-appearance feature in Vintra’s software package. But some did not explicitly rule it out.

The Chinese government, which has been the most aggressive in using surveillance and AI to control its population, is using co-appearance searches to find protesters and dissidents by combining video with a vast network of databases, something that Vintra and its clients cannot do, said Conor Healy, director of government research for IPVM, the surveillance research group that hosted Vintra’s presentation last year. Vintra’s technology could be used to create a “more basic version” of the Chinese government’s capabilities, he said.

Some states and local governments in the US limit the use of facial recognition, particularly in policing, but no federal law applies. There are no laws that expressly prohibit police from using co-appearance searches like Vintra’s, “but it’s an open question” whether doing so would violate constitutionally protected rights to free assembly and protections against unauthorized searches, according to Clare Garvie, a specialist in monitoring technology with the National Assn. of Criminal Defense Lawyers. Few states have any restrictions on how private entities can use facial recognition.

The Los Angeles Police Department ended a predictive policing program, known as PredPol, in 2020 amid criticism that it did not stop crime and led to heavier policing of Black and Latino neighborhoods. The program used AI to analyze large amounts of data, including suspected gang associations, in an effort to predict in real time where property crimes would occur.

Without national laws, many police departments and private companies must weigh the balance between security and privacy on their own.

“This is the Orwellian future come to life,” said Sen. Edward J. Markey, a Massachusetts Democrat. “A shocking state of surveillance where you are being tracked, marked and categorized for use by public and private sector entities – without your knowledge.”

Markey plans to reintroduce a bill in the coming weeks that would ban the use of facial recognition and biometric technologies by federal law enforcement and require local and state governments to ban them as a condition of receiving federal grants.

For now, some departments say they are choosing not to use such tools because of reliability concerns. But as the technology improves, that could change.

Vintra, a San Jose-based software company, presented “correlation analysis” to IPVM, a subscriber research group, last year.

(IPVM)

Vintra executives did not return multiple calls and emails from The Times.

But the company’s chief executive, Brent Boekestein, was more forthcoming about the technology’s potential uses during IPVM’s video presentation.

“You can go in here and create a target, based on this person, and then see who is associated with this person,” Boekestein said. “You can start building a network.”

He added that “96% of the time, there is no event of interest to security but there is always information that the system creates.”

Four agencies that share the San Jose transit station featured in Vintra’s presentation denied that their cameras were used to make the company’s video.

Two companies listed on Vintra’s website, the 49ers and Moderna, the drug company that makes one of the most widely used COVID-19 vaccines, did not respond to emails.

Several police departments have acknowledged working with Vintra, but none have said explicitly that they conducted co-appearance searches.

Brian Jackson, assistant chief of police in Lincoln, Neb., said his department uses Vintra software to save time analyzing hours of video, quickly searching for patterns such as blue cars and other details that match descriptions used to solve particular crimes. But the cameras linked to his department, including Ring cameras and those used by businesses, aren’t of high enough quality to match faces, he said.

“There are limitations. It’s not a magic technology,” he said. “It requires proper inputs for good outputs.”

Jarod Kasner, an assistant chief in Kent, Wash., said his department uses Vintra software. He said he was not aware of the co-appearance feature and had to consider whether it was legal in his state, one of the few that restricts the use of facial recognition.

“We’re always looking for technology to help us because it’s a powerful multiplier” for a department struggling with staffing issues, he said. But “we just want to make sure we’re within the boundaries to make sure we’re doing it properly and professionally.”

The Lee County Sheriff’s Office in Florida said it uses the Vintra software only on suspects and not “to track people or vehicles that are not suspected of any criminal activity.”

The Sacramento Police Department said in an email that it uses Vintra’s software sparingly, if at all, but would not say whether it uses the co-appearance feature.

“We are in the process of reviewing our contract with Vintra and whether to continue using its service,” the department said in a statement, adding that it could not point to instances in which the software had helped solve crimes.

The IRS said in a statement that it uses the Vintra software “to more efficiently review long video footage for evidence while conducting criminal investigations.” Officials would not say whether the IRS uses the co-appearance tool or where it has cameras posted, but only that it follows “established agency protocols and procedures.”

Jay Stanley, an attorney with the American Civil Liberties Union who first highlighted Vintra’s video presentation in a blog post last year, said he’s not surprised that some companies and departments are reluctant to discuss whether they use it. In his experience, police departments often deploy new technology without telling, let alone asking permission from, democratic overseers such as city councils.

The software could be abused to monitor personal and political associations, including potential intimate partners, labor activists, anti-police groups or partisan rivals, Stanley warned.

Danielle VanZandt, who analyzes Vintra for the market research firm Frost & Sullivan, said the technology is already being used. Because she has reviewed confidential documents from Vintra and other companies, she is under nondisclosure agreements that prohibit her from discussing which companies and governments may be using the software.

Retailers, already gathering massive amounts of data on people walking into their stores, are also testing the software to find out “what else can it tell me?” VanZandt said.

That may include identifying family members of a bank’s best customers to ensure they are treated well, a practice that raises the possibility that those without wealth or family connections will be overlooked.

“Those bias concerns are very large in the industry” and are being actively addressed through standards and testing, VanZandt said.

Not everyone believes the technology will be widely adopted. Law enforcement and corporate security agents often find that they can get the same information with less invasive technologies, said Florian Matusek of Genetec, a video analytics company that works with Vintra. That includes ticket entry systems and cellphone data, which carry unique identifiers but are not tied to individuals.

“There’s a big difference between, like, product sheets and video demos and the actual stuff being deployed in the field,” Matusek said. “Often users find that other technology can also solve their problem without jumping through all the hassles of installing cameras or dealing with privacy regulation.”

Matusek said he was not aware of any Genetec clients using co-appearance, which his company does not provide. But he could not rule it out.
