Revealed: the software that studies your Facebook friends to predict who may commit a crime


What do your Facebook posts, who you follow on Instagram and who you interact with the most on social media say about you? According to the tech startup Voyager Labs, that information could help police figure out if you have committed or plan to commit a crime.

Voyager Labs is one of dozens of US companies that have popped up in recent years with technology that purports to harness social media to help solve and predict crime.

Pulling information from every part of an individual's various social media profiles, Voyager helps police investigate and surveil people by reconstructing their entire digital lives, public and private. By relying on artificial intelligence, the company claims, its software can decipher the meaning and significance of online human behavior, and can determine whether subjects have already committed a crime, may commit a crime or adhere to certain ideologies.

But new documents, obtained through public information requests by the Brennan Center, a non-profit organization, and shared with the Guardian, show that the assumptions the software relies on to draw those conclusions may run afoul of first amendment protections. In one case, Voyager indicated that it considered using an Instagram name that showed Arab pride or tweeting about Islam to be signs of a potential inclination toward extremism.

The documents also reveal Voyager promotes a variety of ethically questionable strategies to access user information, including enabling police to use fake personas to gain access to groups or private social media profiles.

Michel Moore, chief of the Los Angeles police department, which documents show has worked or considered working with a number of companies producing hi-tech policing tools. Photograph: Jason Armond/Los Angeles Times/REX/Shutterstock

Voyager, a nine-year-old startup registered as Bionic 8 Analytics with offices in Israel, Washington, New York and elsewhere, is a small fish in a big pond that includes companies like Palantir and Media Sonar. The Los Angeles police department trialed Voyager software in 2019, the Brennan Center documents show, and engaged in a lengthy back-and-forth with the company about a permanent contract.

But experts say Voyager's products are emblematic of a broader ecosystem of tech players answering law enforcement's calls for advanced tools to expand their policing capabilities.

For police, the appeal of such tools is clear: use technology to automatically and quickly see connections that might take officers much longer to uncover, or to detect unnoticed behaviors or leads that a human might not pick up on because of lack of sophistication or capacity. With immense pressure on departments to keep crime rates low and prevent attacks, using technology to make fast and efficient law enforcement decisions is an attractive value proposition. New and existing documents show the LAPD alone has worked or considered working with companies such as PredPol, Media Sonar, Geofeedia, Dataminr, and now Voyager.

But for the public, social media-informed policing can be a privacy nightmare that effectively criminalizes casual and at times protected behavior, experts who have reviewed the documents for the Guardian say.

As the Guardian previously reported, police departments are often unwilling to relinquish the use of those tools even in the face of public outcry and in spite of little proof that they help reduce crime.


Experts also point out that companies like Voyager often use buzzwords such as "artificial intelligence" and "algorithms" to explain how they analyze and process information but provide little evidence that it works.

A Voyager spokesperson, Lital Carter Rosenne, said the company's software was used by a wide range of clients to enable searches through databases but said that Voyager did not build those databases on its own or supply Voyager staffers to run its software.

"These are our clients' responsibilities and decisions, in which Voyager has no involvement at all," Rosenne said in an email. "As a company, we follow the laws of all the countries in which we do business. We also have confidence that those with whom we do business are law-abiding public and private organizations."

"Voyager is a software company," Rosenne said in answer to questions about how the technology works. "Our products are search and analytics engines that employ artificial intelligence and machine learning with explainability."

Voyager did not respond to detailed questions about who it has contracts with or how its software draws conclusions about a person's support for specific ideologies.

The LAPD declined to respond to a request for comment.

A guilt-by-association system

The way Voyager and companies like it work is not complicated, the documents show. Voyager software hoovers up all the public information available on a person or topic, including posts, connections and even emojis, then analyzes and indexes it and, in some cases, cross-references it with non-public information.

Internal documents show the technology creates a topography of a person's entire social media existence, specifically looking at users' posts as well as their connections, and how strong each of those relationships is.

The software visualizes how a person's direct connections are connected to each other, where all of those connections work, and any indirect connections (people with at least four mutual friends). Voyager also detects any indirect connections between a subject and other people the customer has previously searched for.
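The indirect-connection rule described above amounts to a simple graph computation. The only detail drawn from the documents is the four-mutual-friends threshold; everything else here (the function names, the data layout, the toy network) is an illustrative sketch, not Voyager's actual implementation.

```python
# Sketch of the "indirect connection" rule reportedly used by the software:
# two people who are not direct friends count as indirectly connected if
# they share at least four mutual friends. The threshold comes from the
# documents; all names and data structures are hypothetical.
MIN_MUTUALS = 4

def mutual_friends(graph, a, b):
    """Friends that a and b have in common (graph maps name -> set of friends)."""
    return graph.get(a, set()) & graph.get(b, set())

def indirect_connections(graph, subject):
    """People not directly linked to subject but sharing >= MIN_MUTUALS friends."""
    direct = graph.get(subject, set())
    results = {}
    for other in graph:
        if other == subject or other in direct:
            continue  # skip the subject and anyone already a direct friend
        shared = mutual_friends(graph, subject, other)
        if len(shared) >= MIN_MUTUALS:
            results[other] = shared
    return results

# Toy network with made-up accounts
graph = {
    "subject": {"a", "b", "c", "d", "e"},
    "x": {"a", "b", "c", "d"},  # four mutual friends -> flagged as indirect
    "y": {"a", "b"},            # only two mutual friends -> ignored
    "a": {"subject", "x", "y"},
}
```

On a real social graph this pairwise scan would be far too slow; the point is only to make the guilt-by-association logic concrete: proximity in the friend graph, not any action by the person, is what triggers the link.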