Unfortunately, the divide between Silicon Valley and the Pentagon will be bad for national security in the long run. Recent news cycles keep revealing yet another piece of military equipment suffering from software weaknesses. Technology does not stand still; if Russia and China have more of their most talented engineers willing to work on national projects than we do, that will inevitably lead to a competence gap.

Here is an interesting article about the cultural divide and how it can be mediated:

https://www.defenseone.com/ideas/2018/12/divide-between-silicon-valley-and-washington-national-security-threat/153562/
> The runners-up in the vote for the 2018 Arms Control Persons of the Year were the founders and co-chairs of the International Gender Champions Disarmament Impact Group... The impact group developed specific aims for expanding knowledge about the importance of gender issues and practical actions for bringing gendered perspectives into disarmament discussions.

I find it just a little bit hard to take them seriously if they feel this is the second most significant group in arms control last year.
There is a lot of cynicism here in response to a strictly positive social effort. If we (nationally) are actually behind in some kind of "AI" drone-targeting arms race, I suspect a great many of these objectors would be willing to do work to, say, disable incoming drones, or develop systems to scramble/mislead automated targeting systems.

I feel that it is the responsibility of every moral and conscious agent to oppose dark patterns and negative trends within their place of work whenever possible, and while it is easy (and apropos) to accuse Google of perpetrating malicious patterns, I think we ought to laud and publicly encourage internal currents that oppose that trend, not smirk at them or deride them for not doing enough.
Isn't it inevitable that AI will be used to improve targeting technology for weaponry? Am I wrong in assuming that if the U.S. doesn't develop this tech, some other country will?
> The second runner-up was South Korean president Moon Jae-in. He was nominated for promoting improved Inter-Korean relations and a renewed dialogue between Washington and Pyongyang on denuclearization and peace that has led to a number of significant steps to decrease tensions, including a North Korean moratorium on long-range missile and nuclear testing, a halt to U.S.-South Korean military exercises, and steps to avoid military incidents along the demilitarized zone that divides North Korea and South Korea.

IMO this is the only one on the list that matters. The Pentagon will get another US-based tech company to aid it in the pursuit of new weapons because there is just too much money to ignore. Meanwhile, peace talks between South and North Korea actually reduce the chance of a thermonuclear war between the two nations. We can only hope that other nations follow suit.
I think younger engineers' morality towards the military distinguishes between technology for great-power warfare, and technology for bombing weddings in Yemen. Aside from some real outliers, nobody of any age wants to lose a war with China. However, it's the "wars of choice" that are problematic.
I'm cautiously optimistic that increased AI use in the military is going to be a good thing for everyone.

Realistically, we're not going to solve the problem of nation states wanting to protect their interests halfway across the world, which will, among other things, mean killing some "combatants" from a drone.

But we can hope to do things like improve targeting and reduce civilian or collateral casualties. Right now the "AI" is some group of 20-somethings sitting behind a computer in Nevada; what if we trained an AI instead, and could e.g. hold legislative audits of what that software was configured to target?
They're perfectly OK with scanning emails and providing that data to hostile countries and getting someone shot in the face, but if the person is shot in the face with an AI-powered weapon, they're not?

Wow, the faux ethical reach-around they're giving each other over this is comical.
Beware of the hubris of those surrounding Peter Thiel, who are war hawks and backers of pro-military startups of all sorts... including Palantir, which is used to target and eliminate dissidents. And even more troubling are a few of the Ukrainian and Russian "entrepreneurs" who come to the Valley as flag-waving capitalists, but whose actual allegiances are difficult to ascertain because the Valley lets people pop up out of nowhere without references, and hands them money and influence.
Yet all 4,000 of them still work for the largest surveillance corporation that has ever existed. At least now they think they have the moral high ground.