#22. Dataminr Introduces Racial Bias, Stereotypes in Policing of Social Media

by Project Censored

In an October 2020 article published by the Intercept, Sam Biddle reported on racial bias at Dataminr, a New York company that monitors Twitter for “suspicious behavior,” which it reports to law enforcement agencies. According to Biddle’s report, Dataminr’s Twitter surveillance program “targets communities of color.” Sources directly familiar with Dataminr’s work told the Intercept that, in Biddle’s words, Dataminr has “relied on prejudice-prone tropes and hunches to determine who, where, and what looks dangerous.” Dataminr’s “domain experts” lacked training, creating a powerful but untrustworthy monitoring system made in the image of their own prejudices and those of the law enforcement agents the system was intended to serve.

According to Dataminr’s own marketing materials, the company’s controversial First Alert program was created to alert “first responders to breaking events, enabling the fastest real-time response,” but, as one source told the Intercept, “Dataminr and law enforcement were perpetuating each other’s biases.”

As Biddle reported, First Alert’s monitoring staff “brought their prejudices and preconceptions along with their expertise.” Sources with direct knowledge of Dataminr’s operations told the Intercept that they were instructed to search for evidence of crime in specific neighborhoods and streets where a majority of residents were people of color. Furthermore, the Intercept reported that, according to sources familiar with the program, Dataminr’s anti-gang activity amounted to “white people, tasked with interpreting language from communities that [they] were not familiar with,” coached by predominantly white former law enforcement officials who themselves “had no experience from these communities where gangs might be prevalent.”

The tolls of policing people of color on social media include heightened levels of tension among those being policed and unnecessary or excessive use of force by authorities. Companies such as Dataminr show that the crisis of over-policing manifests not only in person, on the street, but also online. In a previous article for the Intercept from July 2020, Biddle wrote, “Dataminr relayed tweets and other social media content about the George Floyd and Black Lives Matter protests directly to police, apparently across the country,” despite Twitter’s official policy against “using Twitter data to derive or infer potentially sensitive characteristics about Twitter users.”

The Intercept contacted Twitter spokesperson Lindsay McCallum, but she declined to discuss Dataminr’s surveillance of protesters, most of whom were peaceful. Dataminr, for its part, contends that its law enforcement service only “delivers breaking news alerts on emergency events, such as natural disasters, fires, explosions and shootings.”

Biddle’s October 2020 report quoted Forrest Stuart, a sociologist heading the Stanford Ethnography Lab, who observed that Dataminr’s use of Twitter to infer gang affiliation is “totally terrifying.” Stuart said that research has established how often police officers lack the “cultural competencies and knowledge” required to understand the “behavioral and discursive practices, [and] aesthetic practices” of urban Black and Brown youth, but the “domain experts” employed by Dataminr lack “even the basic knowledge” that officers have regarding criminal behavior.

Almost all of the corporate news media coverage of Dataminr has been positive, highlighting the company’s business partnerships and financial successes without addressing the charges of racial bias raised by Biddle’s reports. In March 2021, for example, CNBC reported that the company’s estimated value had risen to more than $4 billion and that its CEO planned a stock market launch in 2023. In September 2020 the Wall Street Journal published an article about Dataminr and how it “provided alerts to police and other government clients that included Twitter handles of users discussing plans for protests or where activists were blocking streets” during Black Lives Matter protests, noting that Twitter’s rules “prohibit partners from using its data for ‘tracking, alerting or monitoring sensitive events,’ specifically including protests and rallies.” Although the Journal’s report included concerns from privacy advocates about “what level of social-media monitoring qualifies as surveillance,” it failed to address how Dataminr’s service for law enforcement disproportionately targets communities of color. In July 2020, Mashable published a piece based on Biddle’s July 2020 Intercept article about how Dataminr helped police track protesters at Black Lives Matter events following the killing of George Floyd. Black Agenda Report republished Biddle’s October 2020 Intercept article one week after its original publication.

Sam Biddle, “Twitter Surveillance Startup Targets Communities of Color for Police,” The Intercept, October 21, 2020.

Student Researcher: Leslie Palacios (Diablo Valley College)

Faculty Evaluator: Mickey Huff (Diablo Valley College)

Illustration by Anson Stevens-Bollen.