AI: No Problem More Human than War
Artificial Intelligence is making its way out of Science Fiction stories and into reality. And just like in ALL those SciFi movies, it is moving faster than most of us expect, and even fewer of us can control it. A common theme throughout these flicks, even the bad ones, is that the machines suddenly become self-aware and then take over. The Terminator series, good old HAL in 2001: A Space Odyssey, WarGames. Over and over, the human race is surprised when it happens. We always look so stupid. It’s almost comical.
Newsflash: It’s starting. It’s actually happening now. Gaming. Siri. Alexa. Adaptive cruise control in your car. ChatGPT. On a daily basis, human beings are relinquishing more decisions to machines. And we are doing it with a happy, stupid grin because we are LAZY. Don’t know where to go to dinner? Ask Siri. Can’t figure out what to watch on television? Netflix will tell you. Too tired to drive but really need to get there? Autopilot in your car. Need to write an essay or a speech? ChatGPT.
What happens when the global militaries integrate Artificial Intelligence into planning? Wait, they already are. Scenario development and gaming are happening. You remember in the movie WarGames when the WOPR ran the nuclear scenario over and over at increasing speed? We have that capability now. What happens when AI is brought into the decision-making process? Not just scenarios, but actual tactical, operational, and strategic decisions? Scary thought.
Drone technology is evolving. Did you know they are tethering drone tanks to live ones? Conceptually, it is pretty amazing; a tank with a live crew does something and its tethered “battle buddy” mimics it. One crew…two tanks. More firepower on the battlefield with fewer people. How long do you think it will be before that drone tank gets to make decisions on its own? I mean, if the live crew gets killed, don’t you think the Army is going to want that drone tank to keep fighting? What about aircraft? Fighters and helicopters?
I have very, very little faith that the tech community developing this technology will have the restraint or the conscience to NOT be enamored with what they are doing and to actually keep it under control. I have even LESS faith that the militaries of the world, especially OURS, will not jump in with both feet to use AI to its maximum capability WITHOUT fully understanding it and/or being able to CONTROL it.
Don’t scoff at this one and wave it away. Think about it. Pause and REALLY think about it. Do you have that much faith in the Department of Defense? The same one that can’t keep track of billions of taxpayer dollars? The one that cannot help but salivate anytime it is shown the newest and coolest piece of tech? You really believe they will be able to take a pause and say, “Maybe this isn’t a good idea”? I don’t.
This is why we have commanders in the military. They make decisions. And with those decisions, they are held responsible. When they made bad decisions, we used to fire them. We don’t do that anymore. So, if we aren’t going to hold them accountable for the decisions they make…why do we need them? Why not let AI make the decisions?
What happens, what is already happening, is that we are removing the HUMANITY from our lives and, in this case, from combat. I know, the uninitiated are asking, “Humanity in combat? Is that a thing?” Yeah. Yeah, it is. Without humanity, you have atrocities. You have war crimes. You have Armageddon. You have the Holocaust. And when you let machines decide, that’s what you get: a machine answer to a very human problem.
And there is no problem more human than war.