Volkswagen and an Ethics Lesson for Software Developers

When it comes to corporate scandals, Volkswagen has pulled off a doozy. To improve its emissions numbers without sacrificing performance, Volkswagen equipped its diesel-powered cars with software that could detect when the vehicle’s emissions system was being tested, so that the car could secretly switch to a low-emissions mode and pass the test. When the test was over, the software would switch the car back to normal mode. The problem with normal mode, however, is that the car would then exhaust up to 40 times the legal limit of nitrogen oxides into the air. So the unsuspecting driver could write on the ozone in permanent marker while still reveling in the illusion of being green. That’s an audacious level of subterfuge, boldly perpetrated just so that its drivers could experience a little old-fashioned Fahrvergnügen.

Grauenhaft! (German for “dreadful!”)

You’ll notice that software served as the linchpin for Volkswagen’s fraud. Volkswagen apparently programmed the engine-control software that manages the car’s Bosch-supplied pollution controls to recognize the telltale conditions of an emissions test, such as the drive wheels turning through a fixed, dynamometer-style drive cycle while the steering wheel sits untouched. Under those conditions, the software ran the exhaust treatment at full strength; when it detected that the test was over, it dialed the treatment back in favor of performance. Simply put, software enabled their treachery.
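To make the mechanism concrete, here is a minimal sketch of what such a “defeat device” check might look like, written in C in the spirit of embedded engine-control code. Volkswagen’s actual code has never been published, so every name, sensor, threshold, and function below is invented for illustration; the point is how little code the deception requires and how ordinary it looks.

```c
/* Hypothetical sketch only: Volkswagen's real code has never been
 * published, and every name, sensor, and threshold here is invented.
 * An emissions test runs the car on a dynamometer through a fixed,
 * predictable drive cycle, so the drive wheels turn while the
 * steering wheel stays centered -- a pattern a few cheap checks
 * can recognize. */

#include <math.h>
#include <stdbool.h>

struct sensor_state {
    double speed_kmh;          /* vehicle speed      */
    double steering_angle_deg; /* degrees off center */
};

/* Stand-ins for real engine-control calls (invented, not real APIs). */
static void run_exhaust_treatment_at_full_strength(void)  { /* stub */ }
static void reduce_exhaust_treatment_for_performance(void) { /* stub */ }

/* Guess "we are being tested": wheels turning, steering untouched. */
static bool looks_like_emissions_test(const struct sensor_state *s)
{
    return s->speed_kmh > 0.0 && fabs(s->steering_angle_deg) < 1.0;
}

/* Called periodically from the engine control unit's main loop. */
void emissions_control_step(const struct sensor_state *s)
{
    if (looks_like_emissions_test(s)) {
        run_exhaust_treatment_at_full_strength();   /* clean mode: pass the test */
    } else {
        reduce_exhaust_treatment_for_performance(); /* dirty mode: more power    */
    }
}
```

A handful of sensor comparisons buried in an engine control unit’s firmware is, conceptually, all a defeat device amounts to, which is exactly why it could hide in plain sight for six years.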

Software is inherently cloak-and-dagger. I don’t see the commands that enable the browser I’m currently using to write this article; I just use the browser as a tool. We don’t see the instructions that power any of the applications we use every day. We simply consume their services, and only the software development teams get to see the instructions that make those services happen. And sometimes, the software is so clandestine that we wouldn’t even begin to suspect its presence. Why would we think to look for software installed in the pollution control systems of Volkswagens to falsify their test readings? Volkswagen bet that we wouldn’t; for six years, they were right.

Software developers face issues like this every day. If you’re working on a program that has a couple million lines of code or more, what are the chances that someone is going to find the one line of code that diverts a penny of every transaction to your own private bank account? If I’m writing software embedded in a Wi-Fi router, how long will it take for someone to realize that I’m capturing people’s passwords before they’re encrypted and sending those seized credentials to a government entity, or to someone who will use them to steal users’ identities? Or, for a more pedestrian example: if I’m up against a deadline and I ship code that I know has a few potential bugs in it because I don’t have time to fix them, can I just treat the bugs as something to be fixed in the next version, for which the customer will have to pay? We do need to keep the lights on, after all, and so software bugs can become a convenient little revenue stream! And, finally, if I’m employed by Volkswagen, and the boss asks me to write the software that will fool eager customers into thinking they’re driving a car that spews pixie dust out of its tailpipe instead of nitrogen oxides, do I comply, do I leave, or do I satisfy my inner Snowden and blow the whistle?
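To see how plausible that needle-in-a-haystack worry is, consider a deliberately simplistic, entirely invented sketch of the penny-skimming scenario. Nothing here corresponds to any real payment system; it simply shows how small and ordinary the fraudulent lines would look.

```c
/* Invented illustration of the penny-skimming scenario above.
 * In a codebase of millions of lines, the two statements marked
 * below read like any other accounting adjustment. */

#include <stdint.h>

static int64_t hidden_account_cents = 0;  /* the private bank account */

/* Settle a card payment: the merchant receives the amount minus fees. */
int64_t settle_payment(int64_t amount_cents, int64_t fee_cents)
{
    int64_t payout_cents = amount_cents - fee_cents;
    payout_cents -= 1;            /* <-- the entire fraud, line one  */
    hidden_account_cents += 1;    /* <-- and line two, to collect it */
    return payout_cents;          /* merchant quietly loses a cent   */
}
```

Spotting those two lines is trivial when they are presented in isolation. Finding them among millions of lines, with no reason to suspect they exist, is another matter entirely, and that asymmetry is precisely what Volkswagen exploited.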

Most of the software we write provides useful services. In fact, the cloud-computing phrase “software-as-a-service” has always struck me as terribly redundant, because software, by its very nature, serves its users. But those services can be easily bent toward other aims. At every stage of the process, and seemingly with every line of code they write, software developers have the opportunity to do wrong. Thankfully, most choose not to seize that opportunity, but I suspect that their rectitude owes little to how colleges prepared them. Sadly, very little of a software developer’s education is spent discussing ethics and professional responsibility. In reminding us of the great influence software developers wield and the ease with which they can abuse it, Volkswagen’s offenses suggest that Computer Science educators may need to rethink how ethics fits into the curriculum.


About Ray Klump

Ray Klump is Associate Dean of the College of Aviation, Science, and Technology at Lewis University and Director of the Master of Science in Information Security program. See http://online.lewisu.edu/ms-information-security.asp, http://online.lewisu.edu/resource/engineering-technology/articles.asp, and http://cs.lewisu.edu. You can find him on Google+.
