As military-grade robotics get cheaper and more capable, they’ll be armed and put on American streets. With enough data from the social media giants, these machines could mitigate some, possibly many, mass murder attacks.

A single man armed with an assault rifle felled eleven well-armed, well-trained police officers in Dallas this past Thursday, 7 July 2016, five of them fatally. Dozens of policemen and policewomen were at the ready to quell trouble at a potentially tense Black Lives Matter rally, and still the carnage happened.

If we’re supposed to believe the NRA and their political champions that more good men with guns is the answer to Sandy Hook massacres of first graders and educators, the question remains: just how many good men with guns does it take? Are we truly to expect better from educators while they and their children are being baptized in a hail of high-powered bullets?

I can just hear Donald saying, “Fire all those losers and replace ’em with combat-hardened Marines. And while you’re at it, build concertina-wire walls around schools and replace playgrounds with minefields and machine-gun nests.”

One man, one assault rifle, a hail of rounds, and the Dallas police were convinced they were being attacked by multiple snipers. Yes, Mr. Trump, we need more guns on school grounds.

Blame Google et al.

This is about profiling, machine learning, and the increasing complicity of Google, FB, and all the other data-ingesting giants in mass murder. Everything you do is profiled: your current location, your travel habits, your FB posts, your tweets, your Instagrams, your Google searches, your movie-watching habits, your phone calls, your driving style, your pictures, and the rest of your digital self. You are predictable.

To wit, three random forest (machine learning) models for “Classifying Adult Probationers by Forecasting Future [High Risk] Offending” were developed in 2012 within a partnership between University of Pennsylvania-based researchers and Philadelphia’s Adult Probation and Parole Department.1

Using a relatively high number of predictors (down to zip code-level detail), they arrived at the following random forest (more readable on page 35 of the report):


[Figure: Random Forest for Serious Offenders Modeling (page 35 of the report)]

Forecasted versus actual outcomes were compared at 2 and 5 years. At the end of the model’s two-year time horizon, 21% of the forecasted High Risk group had fulfilled their prediction and committed a new serious offense. That proportion can be compared to the 11% of forecasted Moderate Risk cases and 5% of forecasted Low Risk cases that defied the model’s predictions and went on to commit a new serious offense. After two years [beyond the span of time forecasted by the model], the behavioral trends of the three forecasted risk groups not only continue but become more pronounced. Within five years of starting their new probation case, more than a third (36%) of the forecasted High Risk cases resulted in new serious offending, compared to just 20% of forecasted Moderate Risk and 10% of forecasted Low Risk case starts.2
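
For the mechanically inclined, here is a minimal sketch of that kind of three-category random forest. It is not the Philadelphia model: the scikit-learn setup, the feature names, and the synthetic data below are all hypothetical stand-ins (the real predictors and tuning are described in the report). It only shows the shape of the exercise: train a forest on case-level predictors, forecast each case as Low, Moderate, or High risk, then compare the forecasts with what actually happened.

```python
# A minimal, purely illustrative sketch (not the report's actual model):
# a three-category random forest in the spirit of Barnes & Hyatt, trained
# on made-up probation-case predictors and then scored the way the report
# evaluates its forecasts: forecasted risk category versus later outcome.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical predictors standing in for the report's case variables
# (age, prior record, zip code, and so on). The real model uses many more.
cases = pd.DataFrame({
    "age_at_case_start": rng.integers(18, 70, n),
    "prior_serious_charges": rng.poisson(1.5, n),
    "prior_violent_charges": rng.poisson(0.7, n),
    "age_at_first_contact": rng.integers(10, 40, n),
    "zip_code": rng.integers(19100, 19155, n),
})

# Synthetic outcome labels (0 = Low, 1 = Moderate, 2 = High/serious),
# loosely tied to the priors so the toy example has some signal to learn.
risk_score = (cases["prior_serious_charges"] + cases["prior_violent_charges"]
              - 0.05 * cases["age_at_case_start"] + rng.normal(0, 1, n))
labels = pd.cut(risk_score, bins=[-np.inf, 0.5, 2.0, np.inf],
                labels=[0, 1, 2]).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    cases, labels, test_size=0.3, random_state=0)

forest = RandomForestClassifier(n_estimators=500, random_state=0)
forest.fit(X_train, y_train)

# For each forecasted category, what fraction of cases actually turned out
# to be serious (High Risk) offenders? This mirrors the 21% / 11% / 5%
# comparison quoted above, only on fake data.
forecast = forest.predict(X_test)
for code, name in [(2, "High"), (1, "Moderate"), (0, "Low")]:
    mask = forecast == code
    rate = (y_test[mask] == 2).mean() if mask.any() else float("nan")
    print(f"Forecasted {name:8s} risk: {rate:.0%} went on to a serious offense")
```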

Imagine these models being augmented with the rest of your digital profile.

Imagine these models applied to the risk of rampage murder. Even if early models stop only 10% of massacres, it’ll be worth it. And I don’t necessarily mean worth in some moralistic sense. Murdering costs money, big money. “The most violent and prolific offenders singly produced costs greater than $150-160 million in terms of victim costs, criminal justice costs, lost offender productivity, and public willingness-to-pay costs.”3

If Google and FB and Netflix and Instagram and Microsoft are making money profiling you, shouldn’t they also be putting some back? Reducing your risk of being gunned down? Subject to class action lawsuits? Where’s your anger? Maybe the NRA should go after Google et al., Ralph Nader style.

The NRA Institute for Legislative Action, after all, tells us that “Americans overwhelmingly understand high-profile shootings as pointing to a problem with the country’s mental health system, rather than a lack of gun control laws.” (And you know how scientific we Americans are, with our creationism, global climate change denial, and our belief in astrology.)

In truth, we need data, desperately so. Rampage mass murderers don’t tend to survive their deadly 2nd Amendment tantrums for later psychological analysis. And it can’t be all about mental health. “Individuals who commit violent or aggressive acts often do so for reasons unrelated to mental illness…. Research, in fact, confirms the error in associating dangerousness with mental illness, showing that ‘the vast majority of people who are violent do not suffer from mental illnesses. The absolute risk of violence among the mentally ill as a group is still very small and … only a small proportion of the violence in our society can be attributed to persons who are mentally ill.’”4

So where do we start with digital footprint predictors? Start with this: Mass murderers “…are isolates, often bullied in childhood, who have rarely established themselves in effective work roles as adults. They have personalities marked by suspiciousness, obsessional traits, and grandiosity. They often harbor persecutory beliefs, which may occasionally verge on the delusional.”5

In the meantime, while we sort out our 2nd Amendment rights over our right to life and privacy, get some insurance, buy bulletproof vests, home-school your kids, build a bunker, have Amazon deliver your groceries, write your representative a letter, and, you know, vote, and keep on giving the giants your digital crumbs.

(If you wish to quibble that civilian versions of the military M16 aren’t assault weapons, do so on your own time.)

1. “Classifying Adult Probationers by Forecasting Future Offending,” Geoffrey C. Barnes, Ph.D., and Jordan M. Hyatt, J.D., M.S., March 2012, published by the U.S. Department of Justice, Award No. 2008-IJ-CX-0024.

2. Ibid.

3. “Murder by Numbers: Monetary Costs Imposed by a Sample of Homicide Offenders,” Matt DeLisi et al., The Journal of Forensic Psychiatry & Psychology, 2010, 21(4): 501-513.

4. “Predicting the Risk of Future Dangerousness,” Robert T. M. Phillips, Virtual Mentor, June 2012, Volume 14, Number 6: 472-476.

5. “The Autogenic (Self-Generated) Massacre,” P. E. Mullen, Behavioral Sciences and the Law, 2004, 22(3): 311-323.