Building Stethoscopes

I have trouble expressing my thoughts. I don't know the correct words and conversational approaches. I'm unaware of existing literature on this subject. And as a result, I often fall short. I fail to communicate in a useful way and the conversation spirals out of control. One of the problems I've run into several times is the ethics of tools. I believe that all tools contain a quality of ethics. That is to say, tools by nature range from ethical to unethical, regardless of how they are used. This is my attempt to write down why I believe this, and to organize my argument. My hope is that next time it comes up, I'm better prepared and better informed.

Hammers

Strangely enough, I started with the opposite assumption. I thought that a tool was ethically neutral; it's an inanimate object, subject to the human will that motivates it. After all, a tool can make no decisions, and knows nothing of the good or harm it does. A hammer can be used to build, or used to destroy. It's only a tool used to achieve a human end. I reasoned that software is pretty similar. It can benefit or harm, according to the desires of the people using it. The user is empowered, not the dumb thing.

But tools don't exist in a vacuum. They spring piecemeal from the heads of their fathers, and carry with them their fathers' purposes. Each is created to achieve a specific goal, and its form and function lend themselves to that goal. Every use of a tool contains a conversation between the user and that tool's own inherent purpose. If that purpose is unethical, it becomes difficult to use the tool ethically. If that purpose is ethical, it becomes difficult to use the tool to work harm.

A hammer is perhaps an ethically neutral tool. It builds and destroys in equal measure. Its function is dominated by its user. The vast majority of daily-use tools fall into this category. They only extend our set of possible choices, expanding our capacity to do good or evil. Nonetheless, there exist many tools that are undeniably unethical.

Landmines

I like to think of these unethical tools as landmines. Landmines were created to kill people and render areas uninhabitable. And while I am not willing to argue that there is no possible ethical use for a landmine, I think everyone should agree that it is extremely difficult to determine what that use might be. There is no clear rule by which we may place a landmine ethically; the consequences of a lapse of judgment are opaque and terrible. Most, if not all, landmines used throughout history have been used unethically. The only unequivocally ethical choice a landmine's user can make is to disarm and disassemble the mine. To unmake it.

A landmine makes it extremely difficult to make good decisions. Destruction itself is not unethical when carefully measured, but wanton, unthinking destruction certainly is. The form and function of a landmine, and its internal purpose, lend themselves to unethical decisions. They make it easy for a user to destroy without thought, and provide no safeguards against it. They encourage unethical thoughts and actions merely by existing. There are any number of tools like this, ranging from weapons of war to media to governance structures.

These landmines are unethical because they expand the user's ability to make unethical choices, without a corresponding and proportional increase in the user's ability to make ethical choices. They give us many poor options, and very few decent ones. These tools don't empower us, but rather limit us. They limit our ability to be good and ethical people and members of society. This limitation is endowed by the tool's creator, and the tool propagates its limits to its users.

Stethoscopes

Conversely, there are tools that are inherently ethical. They expand our ethical choices, and limit our unethical choices. A stethoscope, for example, enables healing far more effectively than it permits harm. Stethoscopes were designed to heal, to work good in the world, and that purpose improves their users. That's not to say that a stethoscope can't be used to harm, but rather that a stethoscope must be coerced to that purpose.

We build tools to improve our lives, and they have, and they will continue to do so. But it is important to me personally that we build stethoscopes instead of landmines. Or if not stethoscopes, at least hammers. As we create new tools we open up new landscapes of available decisions. We explore areas strange and unknown, and we understand them only poorly. Time and resources limit our ability to chart and civilize these wilds. I'd rather we not waste them heading down dark paths. Nothing we need lies that way.

So how do we determine ahead of time what's a hammer, what's a landmine, and what's a stethoscope? How do we decide what we should build? I really don't know. I don't think it's a fully solvable problem. Again, I'm unwilling to say that there are universal ethical truths. But I believe firmly that there are better systems out there: patterns of thought that limit the harm our tools can do to us, and methods to evaluate the impact of a tool as we build it. And I believe that these considerations must be ingrained in the design and creation of new tools.


Addendum for developers:

We say that software is free or not based on how it impacts the freedoms of its users. Free software was defined 30 years ago, and though the words have been updated several times, the core remains: free software does not limit the freedoms of its users. Applying that same principle, we can say that ethical software does not limit the ethics of its users. Software is designed with a purpose in mind, and that purpose permeates its operations. If that purpose is unethical, it becomes proportionally more difficult to use that software for ethical purposes, which limits the ability of its users to act ethically.

In this case users can, and should, be construed to include operators and maintainers. It is possible to use always-on mobile geolocation ethically, but easier to use it unethically. That software naturally lends itself to invasions of privacy. Much more work must be done to create an ethical version of this system, and thus its users' and operators' abilities to make ethical decisions are limited by the software itself.

In this way we can look at specific applications and make value judgments about their ethics. So far we've done it by gut, and as a result we've done a generally poor job of enforcing our ethical values on software. We've let human rights travesties like PRISM slip by. I believe there must be a litmus test, similar to free software's four freedoms, to help us determine whether software is ethical. And I would like to figure out what that test looks like. I'd love any input.