Dead Sheepdog: Ethical Dilemmas in Software Development

Ethics is the consideration of what one ought to do in a given circumstance within a context of society’s and one’s own values.

Ethics within a given discipline takes into account considerations specific to that discipline and values shared among trained and experienced practitioners of good will within that discipline.

“[Ethics] refers to moral values that are sound, actions that are morally required (right) or morally permissible (all right), policies and laws that are desirable.”

Ethical considerations are often hard to navigate, not just because of the consequences of certain actions but because of ambiguity and dissensus over the given situation and anticipated outcomes.

In this, ethics shares qualities with software development itself.

As Fred Brooks describes, software development is suffused with essential complexity. Attempts to “abstract away its complexity often abstract away its essence.”

Martin, M. and Schinzinger, R., Ethics in Engineering, McGraw Hill, Boston, 2005, pg 8.

“No Silver Bullet – Essence and Accident in Software Engineering”, Brooks, F. P., Proceedings of the IFIP Tenth World Computing Conference, pp. 1069-1076, 1986.

Copyright ©, Ken H. Judy 2008, All rights reserved.

Haidt and Joseph identify five psychological systems, each with its own evolutionary history, that give rise to moral intuitions across cultures.

Haidt and Joseph’s Five Psychological Systems

Foundation — Origin/Expression
Harm/Care — People have a sensitivity to cruelty and harm; they feel approval toward those who prevent or relieve harm, and this approval is culturally codified in virtues such as kindness and compassion, and in corresponding vices such as cruelty and aggression.
Fairness/Reciprocity — The long history of alliance formation and cooperation among unrelated individuals in many primate species has led to the evolution of a suite of emotions that motivate reciprocal altruism, including anger, guilt, and gratitude (Trivers, 1971).
Ingroup/Loyalty — People value their ingroups; they also value those who sacrifice for the ingroup and despise those who betray or fail to come to its aid, particularly in times of conflict. Most cultures have therefore constructed virtues such as loyalty, patriotism, and heroism.
Authority/Respect — People often feel respect, awe, and admiration toward legitimate authorities, and many cultures have constructed virtues related to good leadership, which is often thought to involve magnanimity, fatherliness, and wisdom. Bad leaders are despotic, exploitative, or inept. Conversely, many societies value virtues related to subordination: respect, duty, and obedience.
Purity/Sanctity — In most human societies disgust has become a social emotion as well, attached at a minimum to those whose appearance or occupation makes people feel queasy. In many cultures, disgust goes beyond such contaminant-related issues and supports a set of virtues and vices linked to bodily activities in general, and religious activities in particular.

These five psychological systems operate at both a conscious and a subconscious level. What we feel to be a rationally derived moral judgment may be an instinctive reaction justified after the fact by a reasoned argument.

One study examined people’s responses to two dilemmas: A runaway trolley is headed for five people who will be killed if it continues on its present course. In the first case, the only way to save them is to hit a switch that will turn the trolley onto an alternate set of tracks, where it will kill one person but save five. In the second case, the only way to save the five people is to push an innocent stranger off a bridge onto the tracks below, killing him but halting the trolley.

The study reveals that most people would hit the switch but not push the person. Though they believe this difference is based in reason, the two situations are morally equivalent. fMRI results demonstrate a strong visceral response to pushing a person to their death; this revulsion colors their reasoned response.

Even after the rational arguments for their decision are logically refuted, people embrace their original responses and remain convinced that there is some rational basis for the difference.

Having established that the psychological underpinnings of moral thought are at some level subconscious and irrational, Haidt goes on to demonstrate that, though the five psychological systems are in part innate and in part culturally determined, they also vary among individuals within a society. Different people balance the five systems differently, and these differences drive different conclusions about appropriate moral action.

As support for this claim, one study established that within American culture, political liberals place more weight on considerations of harm, care, fairness, and reciprocity than on ingroup loyalty, authority, respect, purity, and sanctity. Political conservatives balance these concerns more equally.

The result of Haidt’s work is that we can expect one person’s consideration of what is right to differ from another’s, even among people of good will and even in situations where the consequences are clear. Hence, essential complexity: reducing ethical decision making to rules abstracts away the essence of what is really going on in people’s psyches.

Haidt, J. and Graham, J., ”When Morality Opposes Justice: Conservatives Have Moral Intuitions that Liberals may not Recognize”, Social Justice Research, 2007.

Greene, J.D., Sommerville, R.B., Nystrom, L.E., Darley, J.M., & Cohen, J.D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, Vol. 293, Sept. 14, 2001, 2105-2108.



Separate from the essential complexity of moral reasoning itself, Walter Maner argues that computer ethics is so complex and unique that it constitutes a distinct field of ethical study, one with “no satisfactory non-computer analog.”

Computer ethics has no prior analog because computer systems are uniquely complex, fast, cheap, and out of control. If this argument is compelling for the more generalized topic of computer ethics, it applies even more strongly to the subtopic of software, which is the most abstract, most easily replicated, and most arbitrarily complex aspect of a computer system.

When faced with an ethical dilemma in software development, we are therefore confronted with a unique riddle within a riddle. Not only is the attempt to place an ethical context on our actions essentially complex, but the immediate context and potential outcomes are also obscured, because the act of developing software, like the very nature of a software system, is essentially complex.


“We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value: Individuals and interactions over processes and tools. Working software over comprehensive documentation. Customer collaboration over contract negotiation. Responding to change over following a plan.”

Beck, K., Beedle, M., et al., Principles Behind the Agile Manifesto. [Online] [Cited: June 6, 2007] http://agilemanifesto.org/principles.html.

What the originators of Agile practices held in common was a set of values they jointly published as the Manifesto for Agile Software Development. The manifesto consists of a preamble and a list of twelve principles.

“At the core, I believe Agile Methodologists are really about “mushy” stuff about delivering good products to customers by operating in an environment that does more than talk about “people as our most important asset” but actually “acts” as if people were the most important, and lose the word “asset”. So in the final analysis, the meteoric rise of interest in and sometimes tremendous criticism of Agile Methodologies is about the mushy stuff of values and culture.” — Jim Highsmith

Highsmith, J., History: The Agile Manifesto. [Online] [Cited: June 6, 2007] http://agilemanifesto.org/history.html.

Agile principles form an incomplete but meaningful set of ethical guidelines. They address concerns for how we ought to behave towards our customers and employers. They also implicitly speak to ethical behavior towards coworkers, and concern for the reputation of our field. They value personal integrity, discipline, courage, accountability and honesty.

Agile principles do not directly address all the concerns and ethical dilemmas described in ethical codes of conduct. For example, they do not speak to an obligation to take responsibility for the actions of our peers within our industry and outside our immediate organization. They also do not speak to the consequences of our actions for society and the larger world.

Agile practices do, however, provide a way to parse and address essential complexity. They do this by demanding quality throughout the lifecycle, by delivering meaningful, testable, working code in short iterations, and by providing mechanisms for continuous improvement based on regular cycles of retrospection. They define roles and responsibilities that empower individuals and hold them accountable.

The agile community itself provides a resource of active practitioners with some degree of shared values who actively seek support and advice from each other through conferences, lists and local gatherings.

If the agile community would open up its consideration of value, risk, and consequences to encompass the larger world of stakeholders, make explicit such concerns as informed consent, and employ its specific practices to navigate complexity, Agile development would provide a toolkit for surfacing, contemplating, and addressing ethical dilemmas.


The Agile Principles represent a commitment to delivering business value, software quality, honesty, introspection, continuous improvement, humane work environments, empowered workers and customer collaboration.

Their strength is with regard to behaviors that provide direct value to the customer, a sustainable pace, and team and individual excellence.

Agile Principles are silent on the responsibility of the software developer to the general well-being. This includes things developers should avoid, such as participating in actions that benefit customers and employers but potentially harm disempowered and distant stakeholders. It also omits things developers should do, such as volunteering, encouraging fair distribution of computing resources, and being aware of developers’ standing in law.

Finally, Agile Principles don’t consider software developers’ responsibility for the conduct of software developers outside their immediate collaborative team.


The following case study comes from the field of engineering, not software. It also deals with safety-critical systems and dire consequences. It is included because it is a famous and well-understood example that provides a clear role model and that, despite the stakes, presents dilemmas and pressures similar to those faced by developers on less critical systems.

Baura, G. Engineering Ethics: An Industrial Perspective, Elsevier Academic Press, Burlington VT, 2006

Roger Boisjoly was a Thiokol engineer who found “large arcs of blackened grease” on the solid boosters recovered from successful shuttle launches. He identified a correlation between cold temperatures and leakage of hot gases from the O-Ring seals in the solid boosters.

In January 1986, based on Boisjoly’s analysis and forecasts of cooler temperatures than ever experienced during a shuttle launch, Thiokol recommended the shuttle Challenger not launch.

NASA could not proceed over the contractor’s objections. “Appalled” by Thiokol’s recommendation, NASA held a private caucus with Thiokol management. A senior Thiokol executive was asked to “take off his engineer hat and put on his management hat.” (Rogers Commission, 1986)

As a result, while still expressing concern, Thiokol withdrew their objection for lack of definitive proof, an age-old argument for ignoring risk. By definition, no risk is certain.

Challenger exploded during launch, killing all seven aboard. In the aftermath, Boisjoly testified before the shuttle commission, which is why we know all this.

As a result of coming forward, Boisjoly experienced such a hostile workplace he was granted sick leave and then extended disability.

In 1988, Boisjoly was awarded the AAAS Scientific Freedom and Responsibility Award. He is considered a role model of ethical action.

The most important thing to learn from his example is that ethical behavior is not about certainty or infallibility. Despite his expertise, insight, and integrity, lives were lost. At points he respected the chain of command even though he clearly disagreed with its decisions.

When it became clear he had, against his best efforts, contributed to tragedy, he stepped forward to take responsibility for his actions despite the consequences.

Stripped of these life-ending consequences, the situation Boisjoly faced was common. As an expert, hands-on with the technology, he anticipated risks that supervising technical staff and managers did not. When challenged to definitively prove his concerns, he could not, and he deferred.

At stake on one side was his individual and uncertain concern for the lives of the astronauts. On the other were immense costs of delay, the reputation of the shuttle program, the revenue stream of his employer and differing convictions of other engineers and managers.

A person in this circumstance might weigh what they surmise to be the likelihood of a bad outcome against concerns for their livelihood and reputation. They might also consider the possibility that they were wrong and that their managers and other technical staff were right. An experienced person acting in good will might be expected to make the same decisions.
Therefore, while the pressures put upon him and others were unethical, in that they did not take due care for the lives of others, his actions in those circumstances were ethical despite the eventual consequences.

An ethical viewpoint doesn’t demand that we have perfect insight into the future or that we immolate our careers, but that we reasonably consider the implications of our actions for the broad set of stakeholders who might be affected.
However, while acting ethically may have absolved him of legal culpability, it did not relieve him of responsibility. Though he acted reasonably, a different course of action (however extreme) might have saved lives.

To his immense credit, Boisjoly acted from that sense of responsibility. He might have chosen not to come forward, leaving others to fix NASA. He could have quietly continued at Thiokol, or he could have left for another job. Instead, he testified to the shuttle commission, becoming a whistleblower.

In doing this, Boisjoly may have saved the lives of other astronauts. He certainly improved the execution of justice and served the interests of the families of the deceased and the American public.

  • Maner, W., “Unique Ethical Problems in Information Technology”, Science and Engineering Ethics, 2:2, April 1996, pp. 137-54.
  • InfoWorld, “Software developer growth slows in North America”, [online] [cited June 6, 2008] http://www.infoworld.com/article/07/03/13/HNslowsoftdev_1.html
  • U.S. Department of State’s Bureau of International Information Programs, “USA Economy in Brief”, [online] [cited June 6, 2008] http://usinfo.state.gov/products/pubs/economy-in-brief/page3.html
  • SANS, “SANS Top-20 2007 Security Risks (2007 Annual Update)”, [online] [cited November 11, 2007] http://www.sans.org/top20/#c1
  • NIST, “The Economic Impacts of Inadequate Infrastructure for Software Testing”, [online] [cited November 11, 2007] http://www.nist.gov/public_affairs/releases/n02-10.htm
  • Gartner, “Gartner Estimates ICT Industry Accounts for 2 Percent of Global CO2 Emissions”, [online] [cited November 11, 2007] http://www.gartner.com/it/page.jsp?id=503867
  • Martin, M. and Schinzinger, R., Ethics in Engineering, McGraw Hill, Boston, 2005, pg. 8.
  • Brooks, F. P., “No Silver Bullet – Essence and Accident in Software Engineering”, Proceedings of the IFIP Tenth World Computing Conference, pp. 1069-1076, 1986.
  • Haidt, J. and Graham, J., “When Morality Opposes Justice: Conservatives Have Moral Intuitions that Liberals may not Recognize”, Social Justice Research, 2007.
  • Greene, J.D., Sommerville, R.B., Nystrom, L.E., Darley, J.M., & Cohen, J.D., “An fMRI investigation of emotional engagement in moral judgment”, Science, Vol. 293, Sept. 14, 2001, pp. 2105-2108.
  • Beck, K., Beedle, M., et al., Principles Behind the Agile Manifesto. [Online] [Cited: June 6, 2007] http://agilemanifesto.org/principles.html
  • Highsmith, J., History: The Agile Manifesto. [Online] [Cited: June 6, 2007] http://agilemanifesto.org/history.html
  • Baura, G., Engineering Ethics: An Industrial Perspective, Elsevier Academic Press, Burlington, VT, 2006.