There's also the ACM Software Engineering Code of Ethics and Professional Practice: http://www.acm.org/about/se-code
I would change the push from ethical software to ethical development.

Software, like many drugs, is not "unethical" on its own; it is the user (or, in the medical analogy, the doctor) who puts it to an unethical use.

I agree that in a society where software is pervasive, an ethical framework needs to be in place. As an oath, it will not force everybody to comply, but any violation will then be a choice made in full knowledge that it is unethical. And that amounts to intentionality.

Can I advocate that software developers should take an oath to write software that does no harm? Ideally, the law should protect software developers who refuse a particular assignment on ethical grounds.
"I swear to respect the privacy of the user and secure all personal information in accordance with current standards."<p>(Except for Hitler's private communications.)
Isn't software amoral?

The same routine that helps target a missile from a drone could help target viruses more effectively, but someone has to make the choice to use it for that purpose.
Privacy and security are very important, but if one is going to write a code of ethics for software, it should probably include something about not writing software that kills people (see, for example, Therac-25).
This sort of thing has been tried before:

The JSON.org license literally says the software "shall be used for Good, not Evil": https://news.ycombinator.com/item?id=3693108

But there is always the problem of defining evil: http://www.youtube.com/watch?v=JRxl02mULws

How long before the oath says "Except for IBM, its minions and customers..."?
I like the idea, but it is too black-and-white in some places. *Never* exploiting security vulnerabilities? There are certainly cases where that is justifiable -- attacking the Enigma machine, for example (speaking of Turing...). If World War II were to happen today, there would almost certainly be attacks on computer systems, and we would want our military to attack the enemy military's systems.

There is also the matter of law enforcement. It is better for law enforcement agencies to exploit vulnerabilities (with a warrant) than to require special back doors in software. No reasonable person can take issue with the police exploiting a vulnerability to catch a mafia hit man, a criminal arms dealer, etc. Some hacker needs to be willing to write the software that exploits those vulnerabilities. I would say that writing software with back doors is a much more serious ethical problem than exploiting unintentional vulnerabilities.
How about naming it for Aaron Swartz?

As I noted in a comment on GitHub, I would not conflate this issue of privacy and ethics, as it relates to user information collected by applications, with things like war crimes and crimes against humanity.