As killer robots are nominally no different from land mines, I could see support for banning them. My understanding, however, is that land mines (and other area-denial weapons) are *still* allowed if there is a way to definitively disable them at the end of hostilities. If my understanding is correct, and robot weapon designers are able to successfully counter with "but we can turn them off after we're done," then this effort won't go very far.

At a conference over the weekend, one of the couch discussions raised the idea of a 'nearness' limit: you can't use deadly force unless you are within a 10-mile radius of that use. The goal would be to outlaw more developed countries flying drones over less developed countries and picking off their citizens.
Requiring a human to be in the loop in all circumstances is impractical. Communications can be disrupted. Autonomy is also a software issue: it's easy for a country to say it keeps humans in the loop, and then, in a real war, it would be trivial to change that.

And "robots" are no different from any other weapon. Bullets and missiles can be aimed, but they don't discriminate, and they can end up (and often do) hitting civilians and unintended targets. Land mines don't discriminate at all. And what difference does it make if you cover an area with land mines or put an autonomous turret there to watch it instead? I'd argue the turret is better, since it has at least some ability to distinguish enemies from civilians and wildlife, and it can be removed much more easily after the conflict is over.
Couldn't they just ban killer humans? That would prevent most war deaths. Shouldn't we part with the barbaric notion that killing someone is OK as long as the killer is in the army?
So, here's the nasty undercurrent to all this, right?

Drone warfare (whether by land, sea, or air) is about using disposable machines to kill and injure human beings. Engineering dictates that we'll eventually optimize away the part of the control system that is slowest and most prone to failure: the people.

The discussion of "how can we keep people running the robots" is uninteresting, because the entire deck is stacked to guarantee that it will be rendered moot.

~

The real discussion, I posit, is somewhat darker and more chilling:

In order to field a drone army, you need capital. You need factories to build the devices, you need command-and-control infrastructure to deploy them, and you need bright minds to develop them. Drone warfare is difficult to conduct in any meaningful fashion as a third-world nation, or more importantly *as a populace in rebellion*.

To put it bluntly, the use of these engines of war is limited to the rich kids, and there is no chance for appeal or mercy when you are identified as a target.

Think about that for a second.

The wealthy murderer who decides to unleash these does so without any skin in the game, without any chance of dealing with repercussions back home for lost sons and daughters, without any care whatsoever except for a line-item expense. Stubborn rebel holdout? Spin up more terminators the same way we spin up dynos to deal with spikes in load.

The teenage kid holding the rusted AK their parent just dropped, looking at the robot that just made them an orphan? No chance in hell that they'll be spared because they are obviously not a threat: they are a human wielding an automatic rifle, p = .975, execute.

~

This whole thing needs to become verboten, forbidden, the same way we nominally treat chemical and biological weapons.

If we support our .gov and .mil in the use of these weapons, we'll be doing everyone a disservice, and come the day we decide to rescind the support that backs those bastards, we'll find that they no longer need it: they already have the drones and the capital to make their whims felt.
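To make that last point concrete, here is a hypothetical sketch of what such an engagement rule reduces to. Every name and number below is invented for illustration; no real system is implied:

    # Hypothetical sketch of an autonomous engagement rule. All names
    # and numbers are invented for illustration, not from any real system.

    ENGAGE_THRESHOLD = 0.95  # invented confidence cutoff


    def should_engage(p_armed: float) -> bool:
        """Decide lethal force from a single classifier score.

        Note what is absent: no check for age, surrender, or context,
        and no channel for appeal or mercy. Just a number and a cutoff.
        """
        return p_armed >= ENGAGE_THRESHOLD


    if __name__ == "__main__":
        # The orphaned kid with the rusted AK: p = .975, execute.
        print(should_engage(0.975))  # True

The chilling part is not the code's complexity but its simplicity: the whole moral question collapses into one comparison.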
Banning Killer Robots is a great idea. But I'm fairly certain they are built by Killer Humans. These are just a natural extension of a society that values warfare and hegemony over social justice and peace. Note how countries' robotics applications differ according to the number of wars they've recently engaged in.
We probably should ban them at some point. Personally, however, I'd like to see an arms race for the next couple of decades. Massive military spending could fund the R&D needed to get us to commercial uses; jet fighters and breaking the sound barrier, for example, were driven by the military.

Having said that, I must say that I'm actually quite uncomfortable with machines deciding when to pull the trigger. RoboCop is etched in my mind. I hope the remake keeps that scene.
Ah, yes, ban everything that you don't like. That would get rid of it for sure.
Are chemical and biological weapons banned? Yes.
Do we have them? Yes.
Will we use them to survive? FUCK YES.

So stop these meaningless Geneva talks. "Killer robots" will be built, and they will be used.

Don't ban weapons. Ban wars. I don't see why first-world countries would need them anyway.
This is such a fascinating topic because the field of robotics outside of industrial applications is still so nascent. I'm not here to weigh in, just to drop off some useful tidbits for other people interested in this ethical gray area.

This theme of "terminators" and "killer robots" has been really prevalent in the field lately because of the DARPA Robotics Challenge [1], the latest of DARPA's Grand Challenges, which I'm currently participating in on one of the Track B teams. Many people see the break away from bomb-squad bots and factory-floor robotic arms into humanoids as a really scary thing, and the DRC seems to amplify that (if a robot can hold a Sawzall, it can hold a rifle).

Just last month in Atlanta, the IEEE Humanoids conference took place. Dr. Ronald C. Arkin [2] gave an exceptional talk on exactly this topic from an ethics perspective, titled "How to NOT Build a Terminator". It was a plenary talk and not a paper, so I can't really find any record of it to share. Unfortunate.

Tangentially, the lab I work in (the robotics group at the Florida Institute for Human and Machine Cognition [3][4]) has employed and continues to employ a principle we call "co-active design" [5], where we actively work to keep a human in the loop at all times; we're definitely not looking to build a "killer robot". It's an interesting design problem that overlaps a lot with UI and UX, popular topics here on HN. (See the sketch after the footnotes for a rough illustration of the idea.)

And lastly, a shameless plug for the field itself: a lot of people don't realize just how software-oriented robotics research is (especially humanoids, where the fun problems are). A lot of people are stuck on it being a hardware endeavor. While it's true that a chunk of robotics falls in the mechanical-engineering domain, there's plenty of room for hackers from tons of different disciplines to get involved. There are interesting people solving interesting problems, from the Open Source Robotics Foundation (the Willow Garage spin-off) to private groups like Boston Dynamics, yet so many people still see the field as a black box that only opens up for the hardware-inclined. I could see a talented group with the right hacker mindset doing some really interesting stuff in robotics, given the right impetus and execution.

1: http://theroboticschallenge.org

2: http://en.wikipedia.org/wiki/Ronald_C._Arkin

3: http://ihmc.us

4: http://robots.ihmc.us

5: http://www.jeffreymbradshaw.net/publications/20101008_CoactiveDesign.pdf
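As a rough illustration of the human-in-the-loop idea above: this is my own hypothetical sketch, not IHMC's actual architecture or API, and all class and method names are invented. The core pattern is simply that the robot proposes and the human disposes:

    # Hypothetical human-in-the-loop gate, loosely in the spirit of
    # co-active design. All names here are invented for illustration;
    # this is not IHMC's actual software.

    from dataclasses import dataclass


    @dataclass
    class ProposedAction:
        description: str
        estimated_risk: float  # 0.0 (benign) to 1.0 (dangerous)


    def execute_with_operator(action: ProposedAction) -> bool:
        """The robot proposes; a human operator approves or vetoes.

        Every proposed action is surfaced to the operator, who makes
        the final call; the robot never acts on its own authority.
        """
        print(f"Proposed: {action.description} "
              f"(risk {action.estimated_risk:.2f})")
        return input("Approve? [y/N] ").strip().lower() == "y"


    if __name__ == "__main__":
        grab = ProposedAction("grasp the sawzall on the table", 0.3)
        if execute_with_operator(grab):
            print("Executing under operator supervision...")
        else:
            print("Vetoed; robot holds position.")

The design choice worth noticing is that autonomy is bounded to *proposing* actions; the authority to execute stays with a person.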
I think this campaign suffers from something of a branding problem, in that "killer robots" theoretically includes human-controlled robots.

They need to be extremely up-front that what they oppose is *autonomous* battlefield robots. From the title/link, I assumed they opposed all battlefield robots (like the ridiculous anti-drone crowd), which is an absurd and extreme position.

Their actual position, opposing autonomous robots, is a lot more sensible and should be their sole focus.