Many months ago, in one of her quarterly wrap-up interviews with me, Catherine Austin Fitts proposed and outlined the idea that behind the move to recognize "transgenderism" was a move to be able to recognize robots as legal persons for tax purposes, i.e., that any work done by robots for a corporation (or private individual) would be taxable income on a per-robot basis. To put it country simple, expect "joint filing with robot but not spouse" or "filing individually but spouse filing jointly with family robot" boxes on your future tax forms. I received a lot of email on that one! More recently, we've seen some in politics calling for the defunding of local police departments, and I strongly suspect one reason is that it becomes a way for hard-pressed cities to save money by "hiring" robots. Why hire a human when a robot is cheaper and would need neither health insurance nor a union? All it would need would be a mechanic and a robot "garage."
Of course, that is a nightmare scenario, as a robot isn't human and doesn't possess human subtlety. That speeding ticket the officer let you skate on because you were five miles over the limit coasting down a hill while trying to calm the kids in the back seat will be a thing of the past. One mile per hour over the speed limit, and Robocop tickets you. And if you think that would be an inhuman - or rather, anti-human - world, just imagine a Ms. Robo-Judge in the divorce court. But of all human activities that should scare the pants off of us for AI to become involved in, it's war. We've all heard the horror stories of children getting gunned down by trigger-happy people for playing with toy guns, and I have to wonder if the ban-the-toy-gun hysteria is due to some other cause, like preparing to roll out a Mr. Robocop whose pattern recognition software would shoot anything wielding what looked like a gun.
I raise the question because that prospect may be much closer than one might imagine, according to this short article in Sputnik shared by A.C.M.:
Note what was accomplished here:
An artificial intelligence “pilot” just swept the floor with its human adversary in a set of simulated dogfights hosted by the Pentagon’s Defense Advanced Research Projects Agency (DARPA).
The Thursday showdown between man and machine didn’t bode well for the future of flesh-and-blood aircraft pilots, as one of the US Air Force’s top F-16 fighter pilots failed to shoot down an AI-controlled adversary in even one of the five matches.
Now, automatic pilot systems - artificial intelligence of a sort - have been around for a while, and some of them are reportedly capable of landing an aircraft if the flight crew is disabled. Already, not far from where I live, there are freight trains running short-distance hops between the main yard and small towns that are run entirely by AI; there is no engineer in the cab of the locomotive, and indeed, on the entire train there is no human presence. The transition has been happening before our eyes. I'm old enough to remember getting into elevators that were literally run by a human, sitting on a chair, manipulating analog controls. A few years later, they were gone. Many radio stations are almost completely automated.
But dogfights are a considerably more complex operation, involving choice of maneuver, weapons packages, and - for the human - the limit of how many G forces one can endure, a constraint a robot or AI does not share. It will not be long until human pilots are a thing of the past. And what applies to air forces in some immediate future will be extended to navies and armies. Already Russia's newest main battle tank, the Armata, has an almost entirely automated turret and targeting system, reducing the crew of the vehicle to three, with no one in the turret at all. The US Air Force and many other militaries increasingly rely on remote-controlled drones for interdiction purposes. With AI, those "guys in the remote control bunker" could be eliminated as well, or at least significantly reduced in number, as AI can perform typical military operational functions more efficiently and faster, without being subject to the frailties of human nature.
I mention all of this because this "dogfight simulation" is a portent of what happens when significant portions of warfare become more and more automated. I'm not suggesting that the human element of warfare will be eliminated. Far from it.
I'm suggesting something far more horrific: that as the human component in the actual conduct of war increasingly diminishes, non-human military targets will diminish in importance, while human targets correspondingly increase as a measure of the will to continue resistance in a war. It's the firebombing of Dresden or Tokyo, without human pilots in the bombers, but with definite human targets.
And of course, this also means that the weapon of the future is the expert hacker, which might also be an AI...
See you on the flip side...