Elon Musk is scared of killer robots. Should you be?

Posted August 21, 2017 12:38 PM by Logan Albright


Elon Musk is joining a panel of experts in calling for a global ban on the development of “killer robots,” including automated weapons of all kinds — from machine guns to tanks to drones. At first glance, this seems like common sense.

We’ve all seen the “Terminator” movies, and nobody wants violence and death raining down at the hands of soulless automata.

But this is the problem with big-government policies: They always consist of little more than wishful thinking disguised as policymaking. Sure, we can all agree that we don’t want machines to kill humans en masse, but what would be the practical effect of such a ban, and would it actually accomplish what its supporters claim?


A robot by any other name

First of all, it’s important to clarify what we mean by “robot.” We’ve come to think of robots as intelligent, human-like androids that walk, talk, and kill like ordinary humans. But in reality, a robot is little more than a tool.

Robots build cars on assembly lines and ring you up at the grocery store. The word derives from the Czech for “worker” and was first coined by science fiction writer Karel Capek in 1920; robots are really just mechanical helpers, everything from your toaster to your Roomba.

What Elon Musk appears to mean by the term is robots that have some degree of automation — that is, which can run themselves with minimal human involvement. It is reasonable to be cautious about these devices, but the term “robot” is misleading in the public imagination, drawing more from science-fiction films than from reality.

Additionally, the fears over artificial intelligence taking over the world are, in my opinion at least, highly overstated.

After decades of work, we appear no closer to actually creating machines that can think in the same way we think. Self-awareness remains mysterious and elusive, and the famous Turing test (the test of whether a machine can consistently fool a human into thinking it is human) remains unconquered. If artificial intelligence is even possible, it remains a long way in the future.

When killer robots are outlawed, only outlaws will have killer robots

As with any deadly weapon, we all share the same concerns about putting lives in danger. But it is unrealistic to assume a ban will stop all bad actors. What actually happens with weapon bans is that law-abiding citizens are left defenseless, while criminals flout the law and gain a huge advantage over the general population.

Musk warns that once the Pandora’s box of automated weapons is opened, there’s no closing it. He’s right, but he’s foolish to think that a UN ban will stop the box from opening in the first place. You can’t stop technological development by fiat, especially when bad actors have a chance to get a leg up on their enemies.

The only way to defend ourselves against new weapons is to balance the power between attackers and defenders, and figure out ways to apply new methods toward self-defense.

There are also concerns about global governance and whether a ban would violate national sovereignty and constitutional principles.

The Second Amendment protects the American people’s right to keep and bear arms; there’s nothing to indicate that we ought to exclude robot arms. Ceding sovereignty to an international organization could leave us defenseless against other, less scrupulous countries.

I do not mean to sound bellicose, and the last thing I want is war of any kind. But it’s unrealistic to assume that words on a piece of paper will halt the inevitable march of technology and prevent people from developing automated weapons.

The bottom line is that robot weapons are going to happen one way or another, and the more pertinent question is not how to stop them, but how to defend against them.

Logan Albright is a researcher for Conservative Review and director of research for Free the People. You can follow him on Twitter @loganalbright73.