seems like a dupe of <a href="https://news.ycombinator.com/item?id=17074148" rel="nofollow">https://news.ycombinator.com/item?id=17074148</a> which has more comments
dupe of <a href="https://news.ycombinator.com/item?id=17064776" rel="nofollow">https://news.ycombinator.com/item?id=17064776</a> (98 comments) (discussion of the Gizmodo article that's the source of this article)
I once worked for a large industrial group in Europe. The kind that has a piece of every little pie there is - military, transportation, etc. I was pretty happy working there .. until I got a demo from the 'defence' group.<p>They demonstrated the willingness to push the company's technology into heinous, heinous territory. The kind of thing where a drone would be able to follow a single person in a crowd and target them for execution - unguided, of course.<p>I quit the next day. Those of us who make technology need to be very sure it is not used destructively against the human species. The responsibility is very, very high. And the danger is extreme. These people were revelling in the fact that they could develop targeted assassination drones and sell them to any country in the world.<p>Heinous.
I’d like to offer the point of view that if drones become better and more surgical in their precision, it would reduce civilian casualties.<p>Like it or not, the world is full of extremists who would like nothing more than to hurt innocent people. There is no “oh just send the cops and arrest them!” route to take.<p>Shit, just look at the time Osama bin Laden could have been bombed with a Tomahawk missile during Clinton’s presidency. Clinton didn’t do it because of the potential to kill a Saudi prince bin Laden was meeting at that time.<p>Would those angry Googlers be against surgically killing Osama? I think not.<p>Better drone software might help track a potential target and present the optimal window in which the target could be shot with reduced civilian casualties. It could also provide better intel to enable a surgical ground strike, which would put more American soldiers at risk but would allow for better intel and, again, fewer civilian deaths.<p>Lastly, it could offer new knowledge and experience in tracking humans with drones during humanitarian disasters. It could also help in tracking victims of kidnapping. Are the Googlers opposed to rescuing the thousands kidnapped by Boko Haram and company?<p>Who is going to go into the African heart of darkness to rescue those people? Is it the armchair Googlers who pretend to know better?
I wouldn’t say that developing military hardware necessarily negates the “don’t be evil” principle (especially if the developed articles are dual use, for both military and civilian applications). The western world, our principles and values have prospered for more than half a century in the Pax Americana afforded, in large part, by the prosperous US Military Industrial Complex.<p>I totally get the objection to developing combative AI - that’s a separate ethical question - but you can contribute to the military and still maintain your humane values.
Kudos to those who are ready to stand up for their principles.<p>If you are at G and wondering whether you should resign or not, remember this: the market for AI talent is super hot. You will immediately find lots of great and challenging AI work pushing humanity forward.
I am worried we are not too far away from this becoming a reality: <a href="https://www.youtube.com/watch?v=TlO2gcs1YvM" rel="nofollow">https://www.youtube.com/watch?v=TlO2gcs1YvM</a>
> runs contrary to Google's ethos -- the mantra "don't be evil" has long been at the heart of Google's principles<p>Thinking that the military of your own country is "evil" seems a bit puerile to me. Watching too many movies and TV shows can have that effect.
I wish there were a legally enforceable version of Douglas Crockford's "Good, not Evil" license. I haven't released any source code that could have military applications yet, but if I ever do, I want to make it 100% clear that it's <i>not</i> to be used for any task related to the killing or injuring of other people. We're in a unique position as programmers where even the tiniest bit of our code can affect thousands or millions of people across the globe, and this terrifies me.<p>The GPL has already shown us that a license has the power to change culture and behavior (in however small a way). We should be able to extend this approach to other values we hold dear.
I just want to congratulate those Googlers. It isn't common to quit a good job over ethical concerns. The world would be a lot better if there were more people like you.
Somebody else will fill the gap. It is simply a consequence of military science: if a human in the loop makes combat systems less effective, then other countries will seek an advantage by removing the human from the loop. It's a classic arms race at this point.<p>... This is a Pandora's box that has already been opened, I am afraid.
I've been asked to work for military industry companies before, and I have always declined for ethical reasons.<p>But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes I'm on the ethical high ground) declines these types of jobs.<p>Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?<p>As another poster wrote, it's "good" that the targeting gets more precise, meaning less collateral damage.<p>But to each his own. We need to be able to sleep at night as well. And that to me also seems like a really good reason to decline.<p>I'm kind of on the fence about wanting to work in that industry.
some incoherent ramblings...<p>I would be curious to see the list of employees who quit for this reason. They are making a statement. They might as well publicly disclose their identities to inspire more people.<p>Also, I wonder: are most of them financially independent enough to have made this decision? When money is not a worry, people have the freedom to truly align themselves externally with their internal core values. If you are constantly worried about paying rent or securing your kids' future, you make compromises. That is not ideal for building a great society.
> Google recently sponsored the Conservative Political Action Conference, for instance.<p>Of course, there would be no conscientious objectors if it were a liberal event