Killing Robots and Just War Theory

27 points by behindai, about 3 years ago

8 comments

torstenvl, about 3 years ago
I'm not sure what the HN policy is when the original title is heavily editorialized, but I flagged this submission for editorialization and for politics.

The linked post does not discuss Just War Theory at all. It discusses political positions of certain thought leaders as they pertain to IHL. There's nothing wrong with that per se, except (a) if you're quoting Jonathan Parry more than Thomas Aquinas, what you're discussing isn't Just War Theory, even if it is desperately trying to claim that philosophical heritage; and (b) it isn't a "theory" at all because, within a secular framework, there is neither a set of foundational premises accepted universally as truth nor any testability. Just War Theory is only a "theory" *within* a Catholic epistemological framework. If your epistemological framework does not include a deity or the possibility of one day conversing with that deity, then it's meaningless to talk about your interpretation of their rules as being true or false.
Comment #30988070 not loaded
Comment #30990146 not loaded
michaelt, about 3 years ago
I once had a robotics professor who argued cruise missiles meet most definitions of 'robot'. Meaning, of course, that we've had killer robots for decades.

(I suspect his larger point was that, if you try to come up with a definition which includes an industrial robot arm, a bomb disposal robot, and a robot vacuum cleaner, those things don't actually have much in common - so your definition will sweep up a bunch of things that most people wouldn't call robots)
Comment #30987794 not loaded
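[Editor's note: to make the professor's point concrete, here is a minimal sketch applying one textbook-style "sense, plan, act" test of robot-hood to michaelt's examples. The class, fields, and attribute assignments are all illustrative assumptions, not taken from the article or the comment.]

```python
# A toy "sense, plan, act" test for robot-hood, applied to michaelt's
# examples. All names and attribute choices here are illustrative.

from dataclasses import dataclass

@dataclass
class Machine:
    name: str
    senses_environment: bool   # has sensors (cameras, radar, bump sensors...)
    plans_autonomously: bool   # chooses actions without a human in the loop
    acts_physically: bool      # moves or manipulates the physical world

def is_robot(m: Machine) -> bool:
    """One common textbook-style definition: sense, plan, act."""
    return m.senses_environment and m.plans_autonomously and m.acts_physically

machines = [
    Machine("industrial robot arm", senses_environment=True,
            plans_autonomously=True, acts_physically=True),
    Machine("robot vacuum cleaner", senses_environment=True,
            plans_autonomously=True, acts_physically=True),
    Machine("bomb disposal robot", senses_environment=True,
            plans_autonomously=False,  # typically teleoperated by a human
            acts_physically=True),
    Machine("cruise missile", senses_environment=True,  # e.g. terrain-matching guidance
            plans_autonomously=True, acts_physically=True),
]

for m in machines:
    print(f"{m.name}: robot? {is_robot(m)}")
```

Run as written, the test admits the cruise missile but rejects the teleoperated bomb disposal robot; loosening the autonomy requirement to readmit the latter sweeps in even more things nobody calls robots, which is exactly the definitional trap the comment describes.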
yboris, about 3 years ago
An aside about "Just War Theory" (JWT). I took a semester course in philosophy with Jeff McMahan (who has been working in this area for a long time). Having a theory (and thus a set of rules most countries try to follow) is better than nothing, but the current theory seems absurd. It exculpates soldiers who participate in war - it claims they are not doing anything wrong when they follow orders to kill other combatants.

The simplest example of a justified war is one of defense (like what Ukraine is engaging in). But under JWT, even though Russia is not justified in its actions, the individual soldiers are morally permitted to follow orders to kill combatants. Thankfully, at least, under JWT soldiers are not justified in following orders to kill noncombatants.
Comment #30988104 not loaded
Comment #30988159 not loaded
Comment #30988190 not loaded
binarymax, about 3 years ago
This treatise only focuses on one point of view from two studies, and asks if killer robots are OK.

But there are plenty of other philosophies and ethical/logical paths one can take.

In particular, the article doesn't touch upon nonviolent resistance, which one can argue is more ethical than self-defense. I also don't agree that you can reduce war to a series of individuals protecting their right not to be killed, and therefore that war at scale is justified in a general sense. I do appreciate the mention of information asymmetry in conscription as a flaw in the ethical standing of those who would wage war, but this is obvious, since that's how pretty much every war has ever been started.

Machines designed to kill people don't have any ethical escape route. They are by definition evil constructs. The creation and use of an evil construct is not justified in any ethical argument.
kkfx, about 3 years ago
The current trend toward robotization has nothing to do with moral, humanitarian, or legal reasons; it is about ensuring obedience. Governments everywhere are drifting toward dictatorship, and every dictatorship knows that at a certain point in time people will revolt - robots will not.

In the past there was no way to minimize the need for loyal humans: workers are humans, soldiers are humans, and so on. Now automation has reached the point where *some* of these roles can be automated. This may be sold in moral, humanitarian, or legal terms, and in some respects it may even genuinely be those things, but the real target is just the old classic race toward absolute power.
Comment #31003204 not loaded
lordnacho, about 3 years ago
I'm no expert on ethics but what I've read about just war is unconvincing. It's just not clear to me what a reasonable justification is. Say it's if you're being attacked...

I guess we'd all agree if your neighbor is driving armor over the border, you're being attacked.

But can you attack them before they get to the border? Surely yes, why would you wait until they have the greatest advantage?

What about when the order is given? Surely yes, why wait two weeks for them to organize themselves? Hit them while they're getting ready to kill you.

What about while they're deciding? Of course, thinking about attacking is threatening too. Get them while they're deciding; that way you minimize the amount of harm you have to cause them.

What about if they're simply driving their armor around, building up just in case you attack them? That seems fair, doesn't it? But why should you let them threaten you into giving any concessions?

It just never really seems like it ends. There's always some wrinkle where you can say "but he started it", and it's not actually all that clear.

By the same logic, I don't see how you can really say your war is just, just because you follow certain rules. "We won't kill any civilians" seems fair, but like it says in the article, there are no civilians. They're either driving a tank, or they built the tank, or they clean the tank, or they bought the tank, or they trained the driver, or they say nice things to the tank driver.

Likewise "we won't use WMD/cluster bombs/rape/genocide". All those things will happen if things get desperate enough. I mean sure, if you can roll into their capital and everybody thinks it's great, we won't have to pour Agent Orange all over. But if it's not going our way, we will lower our standards. First by saying things that aren't nice, then by doing things that aren't nice. This is why international conflicts are not simply decided by a tug-of-war or a game of football. We can't accept the result thus far, so we will do something not nice, anything from sending diplomats home to ICBMs, until we get our way. The fact is there are no rules; we only say there were rules and that the loser broke them. If we'd lost, they'd say we did something wrong.

/rant
Comment #30987827 not loaded
Comment #30988248 not loaded
therobot24, about 3 years ago
This site is much more usable when NoScript blocks all JavaScript.
pdfernhout, about 3 years ago
The "Bulletin of the Atomic Scientists" website has not yet seen fit to post a comment I made there a few days ago on "Russia may have used a killer robot in Ukraine. Now what?": https://thebulletin.org/2022/03/russia-may-have-used-a-killer-robot-in-ukraine-now-what/#comment-40002

So I will put a copy of what I posted there here on HN, since it is relevant to the bigger picture of killer robots:

I've thought about how to mitigate the risks of (potentially self-replicating) autonomous military robots and other potential WMDs since the 1980s, when I was a visitor at the CMU Robotics labs of Hans Moravec and Red Whittaker, and also from taking classes with Frank von Hippel, Steve Slaby, George Miller, Gerry O'Neill, and Jim Beniger at Princeton, and from hanging out a few times with Freeman Dyson at the IAS and also talking with Ted Taylor (whom Freeman Dyson put me in touch with) -- as well as talking to many other people and reading widely on this and related issues (both non-fiction and also insightful fiction, like James P. Hogan's Voyage from Yesteryear). My thoughts on this are summarized in an essay I wrote in 2010 on my website, "Recognizing irony is key to transcending militarism".

Essentially, political controls will likely ultimately fail in the long term, given that parties to the (unlikely-in-any-case) political agreements have big incentives to cheat as long as they view the world in zero-sum competitive terms. As Freeman Dyson explained in his book "Weapons and Hope", as long as the capacity to build WMDs remains, they will always be a potential issue regardless of what treaties exist. The capacity to build such things will always eventually exist in an advanced technosphere. Right now it is relatively easy to build bioweapons and drones, and much harder to build nukes (which might get easier with new advances in physics and materials science). Frankly, we are doomed if we depend on political treaties alone to deal with this. What might succeed is ultimately a change of heart to A Newer Way of Thinking (as Albert Einstein and others like Donald Pet have talked about). Such a change of heart will involve emphasizing mutual security, intrinsic security, and achieving abundance for all through wise use of advanced technology. It is a cultural change that will make the biggest difference, where in the long term various political agreements might at most just reflect a global change of heart and perspective.

On a practical basis, tools that reflect a newer way of collaborative thinking, like Dialogue Mapping with IBIS (Jeff Conklin) and Convergent Facilitation (Miki Kashtan), may help groups a lot in thinking of better solutions that address the needs of everyone (see the sketch at the end of this thread). Better Free and Open Source Intelligence tools, widely deployed, may also help everyone make better decisions and avoid disasters like Ukraine.

==== More Details

To that end of a change of heart and perspective towards a newer way of thinking, here are the main points from that essay:

Military robots like drones are ironic because they are created essentially to force humans to work like robots in an industrialized social order. Why not just create industrial robots to do the work instead?

Nuclear weapons are ironic because they are about using space-age systems to fight over oil and land. Why not just use advanced materials as found in nuclear missiles to make renewable energy sources (like windmills or solar panels) to replace oil, or why not use rocketry to move into space by building space habitats for more land?

Biological weapons like genetically engineered plagues are ironic because they are about using advanced life-altering biotechnology to fight over which old-fashioned humans get to occupy the planet. Why not just use advanced biotech to let people pick their skin color, or to create living arcologies and agricultural abundance for everyone everywhere?

These militaristic socio-economic ironies would be hilarious if they were not so deadly serious. ...

Likewise, even United States three-letter agencies like the NSA and the CIA, as well as their foreign counterparts, are becoming ironic institutions in many ways. Despite probably having more computing power per square foot than any other place in the world, they seem not to have thought much about the implications of all that computing power and organized information for transforming the world into a place of abundance for all. Cheap computing makes possible just about cheap everything else, as does the ability to make better designs through shared computing. ...

There is a fundamental mismatch between 21st-century reality and 20th-century security thinking. Those "security" agencies are using those tools of abundance, cooperation, and sharing mainly from a mindset of scarcity, competition, and secrecy. Given the power of 21st-century technology as an amplifier (including as weapons of mass destruction), a scarcity-based approach to using such technology is ultimately just making us all insecure. Such powerful technologies of abundance, designed, organized, and used from a mindset of scarcity, could well ironically doom us all, whether through military robots, nukes, plagues, propaganda, or whatever else... Or alternatively, as Bucky Fuller and others have suggested, we could use such technologies to build a world that is abundant and secure for all. ...

The big problem is that all these new war machines and the surrounding infrastructure are created with the tools of abundance. The irony is that these tools of abundance are being wielded by people still obsessed with fighting over scarcity. So the scarcity-based political mindset driving the military uses the technologies of abundance to create artificial scarcity. That is a tremendously deep irony that remains so far unappreciated by the mainstream.

We the people need to redefine security in a sustainable and resilient way. Much current US military doctrine is based around unilateral security ("I'm safe because you are nervous") and extrinsic security ("I'm safe despite long supply lines because I have a bunch of soldiers to defend them"), which both lead to expensive arms races. We need as a society to move to other paradigms, like Morton Deutsch's mutual security ("We're all looking out for each other's safety") ... and Amory Lovins's intrinsic security ("Our redundant, decentralized local systems can take a lot of pounding, whether from storm, earthquake, or bombs, and would still keep working"). ...

Still, we must accept that there is nothing wrong with wanting some security. The issue is how we go about it in a non-ironic way that works for everyone. ...

--Paul Fernhout
"The biggest challenge of the 21st century is the irony of technologies of abundance in the hands of those still thinking in terms of scarcity."
Comment #31040707 not loaded
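[Editor's note: the Dialogue Mapping / IBIS technique pdfernhout mentions has a concrete structure worth sketching: deliberation is recorded as a tree of issues (questions), positions (candidate answers), and arguments for or against those positions. The sketch below is a minimal illustration under that assumption; the class names and the example mapping are mine, not taken from Conklin's books or any actual IBIS tool such as Compendium.]

```python
# A minimal sketch of the IBIS (Issue-Based Information System) node types
# behind Dialogue Mapping. Names and the example tree are illustrative.

from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    children: list["Node"] = field(default_factory=list)

class Issue(Node): pass      # a question under deliberation
class Position(Node): pass   # a candidate answer to an Issue
class Pro(Node): pass        # an argument supporting a Position
class Con(Node): pass        # an argument objecting to a Position

# Part of this very thread, mapped as an IBIS tree:
root = Issue("Should autonomous weapons ever be fielded?")
p1 = Position("No: machines designed to kill are unjustifiable")
p1.children += [
    Pro("They have no ethical escape route (binarymax)"),
    Con("Cruise missiles already meet most definitions of 'robot' (michaelt)"),
]
root.children.append(p1)

def show(node: Node, depth: int = 0) -> None:
    """Print the deliberation tree with one level of indent per link."""
    print("  " * depth + f"[{type(node).__name__}] {node.text}")
    for child in node.children:
        show(child, depth + 1)

show(root)
```

The point of the structure is that disagreements get attached to specific positions rather than to people, which is what makes the map useful for the kind of collaborative problem-solving the comment advocates.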