Sam Gentle.com

Surgeons, pilots and code monkeys

I learned an interesting thing recently: surgeons don't perform surgeries they don't want to. Obviously you can refuse to do your job at any job, but surgeons don't get fired for refusing to do a surgery. In fact, it's an important bit of dialogue between physicians and surgeons: what would it take for you to be willing to do the surgery? Nobody orders a surgeon around because, when it comes down to it, they're the one who puts the knife in, and whatever happens afterwards is on their conscience. Nobody else can take that burden, so nobody else can tell them what to choose.

The same is true of pilots. A pilot is considered in command of the plane; they are directly responsible for every life on board. A pilot's decision trumps anyone and everyone else's. Air Traffic Control can say "don't land yet" and the pilot can say "it's my plane and I'm landing, figure it out". Doing that without a good reason is likely to lose you your pilot's license, but it's not only acceptable but obligatory if the situation merits it. As a pilot, those are your lives in the back of the plane, and nobody else can absolve you for what happens to them.

But software does not have this same sense of sacred responsibility. More often the conversation looks like developers saying "we shouldn't do it this way", the management or client saying "well we want you to do it that way", and the developers saying "okay, your funeral". Usually that is a figurative rather than a literal funeral, and just means losing money or time. But there are famous examples of the other kind too, the Therac-25 radiation therapy machine, whose software faults killed patients, being perhaps the best known. As a developer, can you really say you are not responsible for the bad decisions you accept? Are you not wielding the knife or holding the controls?

The current state of the art says no, developers are not like pilots or surgeons. The responsibility for bad decisions lies with management, and you can feel safe in the knowledge that someone else is liable for the bad code that results. Perhaps this makes sense in the classical programmer-as-overpaid-typist environment, where your job was not to think but to turn someone else's thoughts into code. How can you be responsible if you are just one of a hundred thousand code monkeys banging away at Big Blue's infinite typewriter farm?

But modern software development is not like that. Developers are expected to be autonomous: to understand requirements, plan, design, make decisions, build, test, rebuild, deploy and demonstrate. Today's developers are more like pilots or surgeons than anyone cares to admit. They have particular professional knowledge and skills that nobody else has, which gives their decisions a moral weight – a responsibility. If that professional knowledge says "this decision is a bad decision", that developer is every bit as obligated as a pilot or surgeon to stand up for their profession and refuse to do the work.

Perhaps that seems overdramatic, but software is growing faster and doing more than any industry in the last century. It's hard to even find something that can't be ruined by bad software. The software in your batteries can burn down your house. The software in your smoke alarm can turn your life into a dystopian horror film. The software in your phone can monitor every sound and movement you make. The software in your car can stop your brakes from working. The software in the cloud can leak your naked photos, arbitrarily remove your data or lock you out of it, and reveal your personal information to repressive governments.

The question isn't whether the people who make these things should be considered as professionally responsible as a pilot or surgeon. The question is: how can you even sleep at night knowing that they aren't?