In “Phaedrus”, Plato quotes Socrates as having taught, “The parent or inventor of an art is not always the best judge of the utility or inutility of his own inventions to the users of them.” From ubiquitous and intrusive social media to the rapidly expanding use of robotics to drive automation, we face the daunting reality of Socrates’s claim. Advances in software development, embedded computing, data communication, machine learning, and distributed data analytics have created an abundance of conveniences that seem sprung from the pages of relatively recent science fiction.
But the pace has perhaps been too quick to give us much chance to weigh future impacts, and when we try, the discussion tends to become polarized and acrimonious. Take, for example, the two very different views offered by Zuckerberg and Musk: the Facebook founder celebrating the promised glories of ever-bigger data-driven intelligence, and the Tesla founder warning fearfully of the rise of the machines. Most likely, the future will reflect both visions. As shapers of that future, our students must recognize that technology reaches far beyond bytes and gates and that they have an obligation to consider the possible negative consequences of their work.
So how can aspiring tech innovators evaluate the impacts of their work so that, ultimately, they can decide whether and under what conditions to pursue it?
It is perhaps useful to turn to the relatively new field of Computer Ethics for guidance. I am certainly no expert on this topic, but I am excited to learn more. As with any exploration of ethical issues, we may quickly find ourselves in the weeds, debating the subtleties that influence a decision. For those of us just starting out, wrestling with how we might even begin to consider the implications of our technical wizardry, a more direct and less nuanced treatment might provide better guidance.
Fortunately, one does exist. The Computer Ethics Institute offers The Ten Commandments of Computer Ethics, a structurally familiar way to identify behaviors that protect people’s digital assets and those that endanger them. Here they are:

1. Thou shalt not use a computer to harm other people.
2. Thou shalt not interfere with other people’s computer work.
3. Thou shalt not snoop around in other people’s computer files.
4. Thou shalt not use a computer to steal.
5. Thou shalt not use a computer to bear false witness.
6. Thou shalt not copy or use proprietary software for which you have not paid.
7. Thou shalt not use other people’s computer resources without authorization or proper compensation.
8. Thou shalt not appropriate other people’s intellectual output.
9. Thou shalt think about the social consequences of the program you are writing or the system you are designing.
10. Thou shalt always use a computer in ways that ensure consideration and respect for your fellow humans.
Simple enough, right? Well, not necessarily. When you create a technology, say an app or a new communications protocol or a new way to encrypt data, you have an additional responsibility, one that involves some amount of imagination. While you might not misuse the technology you have created, can others? Can your audience easily use your technology to break any of these edicts? If so, what controls can you create to make it harder for them to do so, even if that eats into your bottom line?
It can be really hard to picture how people might misuse what you create, especially because you likely see only good in it. To help inspire your imagination, consider the Mozilla Foundation’s concept of Internet Health. It describes the state of the network to which we entrust our data in terms of five concepts: Privacy and Security, Openness, Decentralization, Digital Inclusion, and Web Literacy. We can map the ten commandments to these five concepts to demonstrate how the commandments support each concept. In turn, we can show that our application obeys those commandments and thus fosters Internet Health.
- Privacy and Security: does your innovation keep people’s data confidential, verifiably true, and accessible at all times and in all places? Commandments 1, 2, 3, and 4.
- Openness: does your innovation keep the Internet egalitarian, open to all to contribute and shape? Commandments 1, 2, 3, 5, 8, 9, and 10.
- Decentralization: does your innovation avoid handing control over data or services to a central party, even if you would serve as that central party, ensuring instead that people can access data and services through multiple means? Commandments 6 and 7.
- Digital Inclusion: does your innovation expand what people can do through online means, offering them additional opportunities to learn, share, and enjoy, without discriminating against certain people or groups? Commandments 1, 4, 5, and 9.
- Web Literacy: does your innovation help people gain insights and skills that could help them improve and extend the services you are providing? Commandments 8, 9, and 10.
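The mapping above can double as a lightweight design-review checklist. A minimal sketch in Python follows; the concept names and commandment numbers are taken directly from the list above, while the function and variable names are my own illustrative choices, not part of any standard:

```python
# Sketch: the commandments-to-Internet-Health mapping as a review checklist.
# Concept names and commandment numbers come from the mapping above;
# identifiers here are illustrative assumptions.

INTERNET_HEALTH_MAP = {
    "Privacy and Security": {1, 2, 3, 4},
    "Openness": {1, 2, 3, 5, 8, 9, 10},
    "Decentralization": {6, 7},
    "Digital Inclusion": {1, 4, 5, 9},
    "Web Literacy": {8, 9, 10},
}

def concepts_at_risk(violated_commandments):
    """Return the Internet Health concepts touched by a set of
    potentially violated commandments (given by number, 1-10)."""
    violated = set(violated_commandments)
    return sorted(
        concept
        for concept, commandments in INTERNET_HEALTH_MAP.items()
        if violated & commandments  # any overlap puts the concept at risk
    )

# Example: a design that could enable snooping (3) and unauthorized
# use of others' resources (7).
print(concepts_at_risk([3, 7]))
```

During a design review, each commandment a proposed feature might let users break flags the Internet Health concepts the team should revisit before committing to the feature.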
Clearly, these considerations transcend the technical ones. It is hard to put these on a blueprint or UML class diagram. Instead, these should be part of the use cases and project requirements we develop when we justify doing the project in the first place. Just as people talk today of a Secure Systems Development Lifecycle to ensure that cybersecurity is baked into the design and development processes from the start, we should employ an Ethical Systems Development Lifecycle that aims to ensure that new technology benefits rather than harms citizens’ online lives. The Ten Commandments of Computer Ethics and how they support a healthy Internet could serve as a start.