PHILIP K DICK'S DEPARTMENT OF PRE-CRIME A REALITY... BUT IT HAS A COUPLE OF "BUGS"

October 31, 2016 By Joseph P. Farrell

Occasionally I've blogged about the increasing reliance of various law enfarcement... er... enforcement... departments on computer algorithms, not only to predict crimes but also to profile those who might be inclined to become criminals. If it sounds like the science fiction of master author Philip K. Dick, that's because it is. Dick's fiction has, of course, been the subject of numerous movies, from Blade Runner, to Total Recall, to the Tom Cruise movie Minority Report, which is all about psychic prediction of crime and about people being arrested before they ever commit it. OK, the movie isn't about computer prediction or profiling, but it might as well be, if this article shared by Mr. T.M. from Mother Jones is any indicator:

The Legal System Uses an Algorithm to Predict If People Might Be Future Criminals. It's Biased Against Blacks.

Before we get to the high octane speculation of the day, in order to have a backdrop for those speculations, consider these paragraphs from the article:

On a spring afternoon in 2014, Brisha Borden was running late to pick up her god-sister from school when she spotted an unlocked kid's blue Huffy bicycle and a silver Razor scooter. Borden and a friend grabbed the bike and scooter and tried to ride them down the street in the Fort Lauderdale suburb of Coral Springs. Just as the 18-year-old girls were realizing they were too big for the tiny conveyances—which belonged to a 6-year-old boy—a woman came running after them saying, "That's my kid's stuff." Borden and her friend immediately dropped the bike and scooter and walked away.

But it was too late—a neighbor who witnessed the heist had already called the police. Borden and her friend were arrested and charged with burglary and petty theft for the items, which were valued at a total of $80.

Compare their crime with a similar one: The previous summer, 41-year-old Vernon Prater was picked up for shoplifting $86.35 worth of tools from a nearby Home Depot store.

Prater was the more seasoned criminal. He had already been convicted of armed robbery and attempted armed robbery, for which he served five years in prison, in addition to another armed robbery charge. Borden had a record, too, but it was for misdemeanors committed when she was a juvenile.

Yet something odd happened when Borden and Prater were booked into jail: A computer program spat out a score predicting the likelihood of each committing a future crime. Borden—who is black—was rated a high risk. Prater—who is white—was rated a low risk.

Two years later, we know the computer algorithm got it exactly backward. Borden has not been charged with any new crimes. Prater is serving an eight-year prison term for subsequently breaking into a warehouse and stealing thousands of dollars' worth of electronics.

And as if that were not bad enough, the anti-black bias of these programs appears to be endemic and, one suspects, perhaps even innate, or deliberately pre-programmed:

In forecasting who would re-offend, the algorithm made mistakes with black and white defendants at roughly the same rate but in very different ways.

  • The formula was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants.
  • White defendants were mislabeled as low risk more often than black defendants.

Could this disparity be explained by defendants' prior crimes or the type of crimes they were arrested for? No. We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants' age and gender. Black defendants were still 77 percent more likely to be pegged as at higher risk of committing a future violent crime and 45 percent more likely to be predicted to commit a future crime of any kind. (Read our analysis.)

The algorithm used to create the Florida risk scores is a product of a for-profit company, Northpointe. The company disputes our analysis.

In a letter, it criticized ProPublica's methodology and defended the accuracy of its test: "Northpointe does not agree that the results of your analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model."

Northpointe's software is among the most widely used assessment tools in the country. The company does not publicly disclose the calculations used to arrive at defendants' risk scores, so it is not possible for either defendants or the public to see what might be driving the disparity.
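For readers who want to see concretely what the "statistical test" described above amounts to, here is a minimal sketch in Python, run on purely synthetic data, of the two checks the article mentions: comparing error rates across groups, and a logistic regression asking whether race still predicts a high-risk label once prior offenses, age, and gender are held constant. This is an illustration of the general technique only; it is not ProPublica's or Northpointe's actual code, data, or model.

```python
# Minimal sketch (synthetic data, for illustration only) of the two analyses
# described in the article:
#   (1) comparing error rates of the risk tool across racial groups, and
#   (2) a logistic regression that "isolates" race from criminal history,
#       age, and gender.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "black":      rng.integers(0, 2, n),   # 1 = black defendant (synthetic)
    "priors":     rng.poisson(2, n),       # prior offense count (synthetic)
    "age":        rng.integers(18, 70, n),
    "male":       rng.integers(0, 2, n),
    "high_risk":  rng.integers(0, 2, n),   # tool's binary label (synthetic)
    "reoffended": rng.integers(0, 2, n),   # observed outcome (synthetic)
})

# (1) Error rates by group: a false positive is someone labeled high risk
# who did not re-offend; a false negative is someone labeled low risk who did.
for g, sub in df.groupby("black"):
    fpr = ((sub.high_risk == 1) & (sub.reoffended == 0)).sum() / (sub.reoffended == 0).sum()
    fnr = ((sub.high_risk == 0) & (sub.reoffended == 1)).sum() / (sub.reoffended == 1).sum()
    print(f"group black={g}: false-positive rate {fpr:.2f}, false-negative rate {fnr:.2f}")

# (2) Does race still predict a high-risk label once priors, age, and gender
# are held constant? A large, significant coefficient on `black` would mean
# criminal history alone cannot explain the disparity.
X = sm.add_constant(df[["black", "priors", "age", "male"]])
model = sm.Logit(df["high_risk"], X).fit(disp=0)
print(model.summary())
```

The point of the second step is exactly the claim quoted above: if the coefficient on the race variable remains large after controlling for priors, age, and gender, then prior record cannot account for the gap in scores.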

In a way, one is looking at the judicial equivalent of those computer voting algorithms, and this in a season when reports of voting fraud are coming in fast and furious, to the extent that many states, citing a pro-Darthillary bias in the machines, have moved to paper ballots. (Maybe that's because Darth Soros owns stock in companies providing such machines to twenty-two US states, by some estimates. But not to worry! We have the Department of Homeland (In)security and the UN supervising elections, because according to Fearless Leader, such a thing as voting fraud simply doesn't exist in the USSA.) In this case, the bias against blacks, as Mother Jones alleges, is clear and palpable, and it is being used for "sentencing guidelines." Of course, having such biased algorithms around is a pretty handy thing, especially when one is trying to fill up the private for-profit prisons that the same bunch of Neocons foisted on the country in the late 80s and throughout the 1990s. (And I'll give you two guesses about which two famous political families were pushing this.) And like those computerized voting machines or the computerized individual adaptive assessments of Common Core, we're not allowed to see the algorithms because they are, of course, proprietary. All of this, to my mind, argues persuasively that it's high time for the American fascination with technology in the classroom, the voting booth, and the courtroom to come to a screeching halt. It's time to end it. But there's another reason to be concerned...

...which brings me to my high octane speculation of the day. In the past few years the USSA has been under increasing cyber attack from... well, from somewhere. We've seen attacks on Sony, on major banks, even on the US Federal Reserve, and most recently, an almost complete takedown of the internet on the east coast via DDoS attacks. I've also blogged recently about the fact that some US intelligence analysts are concerned that all of these cyber attacks are really reconnoitering operations designed to "map the internet," possibly in preparation for cyber attacks that could take the whole thing, or at least a significant segment of it, down. All of this suggests that these systems are far from secure. Hillary Clinton's campaign has strongly suggested that Russia is behind much of the computer hacking, including the hacking of voting systems. Of course, the idea in itself is ludicrous, as Mr. Putin has recently pointed out. But by raising the issue, she has really suggested that none of these systems is secure. Want to rig an election? Hack it. Want to influence the outcome or sentencing stage of a trial? Hack it. Want to interfere with international financial clearing? Hack it. Want to influence a stock or commodities market and the high frequency trading algorithms that now dominate them? Hack it. Want to manipulate international currency markets? Hack it.

Raising the issue of markets indicates the problem: if trading is increasingly done by high frequency trading algorithms, how can it be maintained that those markets genuinely reflect actual human market conditions? Similarly, how can a computer algorithm reflect the human elements in sentencing? Or, for that matter, in profiling? The Mother Jones article notes that questionnaires given to defendants form a component of the computer evaluation and profile scoring, which means, to a certain extent, that the multiple choice standardized test has now sprouted in the justice system. It will only be a matter of time before computer monitors are put in the jury boxes to "guide and inform" jurors in their deliberations. The biases inherent in the programs, as pointed out by Mother Jones, are plain to anyone who has examined how they show up in standardized tests.
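To see why a questionnaire-driven, proprietary score is so troubling, consider this deliberately toy sketch. The questions, weights, and thresholds below are invented for illustration and have nothing to do with Northpointe's actual formula, which, as noted above, is not public; the point is simply that whoever sets the hidden weights controls the outcome, and defendants never see them.

```python
# A hypothetical, deliberately simplified questionnaire-based risk score:
# answers are multiplied by weights and summed, and the total is binned
# into "low", "medium", or "high" risk. All names and numbers are invented.
HYPOTHETICAL_WEIGHTS = {
    "prior_arrests":        1.5,
    "age_under_25":         2.0,
    "unstable_housing":     1.0,
    "friends_with_records": 1.2,
}

def risk_score(answers: dict) -> str:
    # Weighted sum of the answers (missing answers count as 0).
    score = sum(HYPOTHETICAL_WEIGHTS[q] * answers.get(q, 0) for q in HYPOTHETICAL_WEIGHTS)
    if score < 3:
        return "low"
    if score < 6:
        return "medium"
    return "high"

# Two defendants with identical criminal records land in different bins
# purely because of how the hidden weights treat non-criminal factors.
print(risk_score({"prior_arrests": 1, "age_under_25": 1}))          # medium
print(risk_score({"prior_arrests": 1, "friends_with_records": 1}))  # low
```

Notice that nothing in such a scheme is visible to the person being scored: change a weight or a threshold and the same answers produce a different "risk," which is precisely the opacity problem with proprietary scoring.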

It's high time to dethrone the computer, and put humanity back into trading, into teaching, into assessments, and into evaluations of guilt, innocence, and sentencing.

See you on the flip side...