Dataminr Introduces Racial Bias, Stereotypes in Policing of Social Media

by Vins

In an October 2020 article published by the Intercept, Sam Biddle reported on racial bias at Dataminr, a New York company that monitors Twitter for “suspicious activity” and reports its findings to law enforcement agencies. According to Biddle’s report, Dataminr’s Twitter surveillance program, First Alert, “targets communities of color.” Sources at Dataminr told the Intercept that, in Biddle’s words, the company has “relied on prejudice-prone tropes and hunches to determine who, where, and what looks dangerous.” Dataminr’s “domain experts” lacked training, creating a powerful but untrustworthy monitoring system that targeted people of color.

Dataminr’s controversial First Alert program was created to notify “first responders to breaking events, enabling the fastest real-time response,” but, as one source told the Intercept, “Dataminr and law enforcement were perpetuating each other’s biases.”

As Biddle reported, First Alert’s monitoring staff “brought their prejudices and preconceptions along with their expertise.” Sources close to Dataminr told the Intercept that they were instructed to search for evidence of crime in specific neighborhoods and streets where a majority of residents were people of color. Furthermore, according to sources familiar with the program, Dataminr’s anti-gang monitoring amounted to “white people, tasked with interpreting language from communities that we were not familiar with,” coached by predominantly white former law enforcement officials who themselves “had no experience from these communities where gangs might be prevalent.”

The toll of policing people of color on social media includes heightened tension among those being policed and unnecessary or excessive use of force by authorities. Companies like Dataminr show that the crisis of over-policing manifests not only in person, on the ground, but also online. Biddle wrote, “Dataminr relayed tweets and other social media content about the George Floyd and Black Lives Matter protests directly to police, apparently across the country,” despite Twitter’s official policy against “using Twitter data to derive or infer potentially sensitive characteristics about Twitter users.”

The Intercept contacted Twitter spokesperson Lindsay McCallum, who declined to discuss Dataminr’s surveillance of protesters, most of whom were peaceful. For its part, Dataminr contends that its law enforcement service only “delivers breaking news alerts on emergency events, such as natural disasters, fires, explosions and shootings.”

Biddle’s report quoted Forrest Stuart, a sociologist and head of the Stanford Ethnography Lab, who observed that Dataminr’s use of the Twitter firehose to infer gang affiliation is “totally terrifying.” Stuart said that research has established how often police officers lack the “cultural competencies and knowledge” required to understand the “behavioral and discursive practices, aesthetic practices” of urban Black and Brown youth, yet the “domain experts” employed by Dataminr lack “even the basic knowledge” that officers have regarding criminal behavior.

Almost all of the corporate media news coverage regarding Dataminr has been positive, highlighting the company’s business partnerships and financial successes without addressing the charges of racial bias raised by Biddle’s reports. In March 2021, for example, CNBC reported on the company’s estimated value having risen to more than $4 billion and its CEO’s plans for a stock market launch in 2023. In September 2020, the Wall Street Journal published an article about Dataminr and how it “provided alerts to police and other government clients that included Twitter handles of users discussing plans for protests or where activists were blocking streets” during Black Lives Matter protests, noting that Twitter’s rules “prohibit partners from using its data for ‘tracking, alerting or monitoring sensitive events,’ specifically including protests and rallies.” Although the Journal’s report included concerns from privacy advocates about “what level of social-media monitoring qualifies as surveillance,” it failed to address how Dataminr’s First Alert program disproportionately targets communities of color. In July 2020, Mashable published an article based on a previous report by Biddle about how Dataminr helped police track protesters at Black Lives Matter events following the killing of George Floyd. In October 2020, the Black Agenda Report republished Biddle’s Intercept report.

Source: Sam Biddle, “Twitter Surveillance Startup Targets Communities of Color for Police,” The Intercept, October 21, 2020, https://theintercept.com/2020/10/21/dataminr-twitter-surveillance-racial-profiling/.

Student Researcher: Leslie Palacios (Diablo Valley College)

Faculty Evaluator: Mickey Huff (Diablo Valley College)