I quite enjoy Michael Crichton's "Gell-Mann amnesia effect". It's the strange forgetfulness that happens when you're an expert reading the news. You read some article on a topic in your field, and you are just blown away by how wrong it is. I mean, how could they make such a mess of it? But then you turn around and read another article on a different topic you don't know that much about and assume it's reliable. I watch computer scenes in TV shows and laugh at how ridiculous they are, but medical scenes seem perfectly realistic. It's like there's some powerful disconnect stopping you from making an analogy between your expertise and someone else's.
Crichton's argument is that this has to do with trustworthiness, or that it's something particular to the media. I think it's more that we don't see the parallels between the depth of the things we know about and the depth of the things we don't. When you hear someone who isn't an expert trying to explain something you know a lot about, you can probably guess they're going to get it wrong before they even open their mouth. There's just a lot of complexity that's easy to miss, and you know that because you spent time and effort mastering it. Someone else's field, though, doesn't evoke that same sense. A simple explanation of something you don't know is easy to believe, because it's easy to believe that something you don't know is simple.
A little while ago I experienced this in a big way when talking to my hairdresser. It turns out he travels overseas to conferences about hair, signs up for hair workshops and sometimes goes out to watch live hairdressing demonstrations. Live hairdressing demonstrations! It seems mind-blowing to me that anyone could care about hair that much, or find so much depth in something that, to me, seems so shallow. I mean, you take the hair, you cut the hair. What's the big deal? And yet I'm sure anyone who wasn't familiar with the complexity in software development would say the same thing. They'd be totally bamboozled by the amount of thought we put into editor choice or tabs vs spaces, to say nothing of the decisions that are actually important.
So I think the solution to the Gell-Mann effect is to learn to build that analogy and equate the depth you find in your own field with the depth of unfamiliar fields. If you understand that religious studies is as rich and complex a topic as computer security, if you believe that it has the same obscure and trivial disagreements and fundamental paradigm-shaking moments, and if you recognise those same traps that look like understanding but are really just easy posturing, it should be a lot harder to fall for the next Dan Brown book.
There's a lovely term I've heard used to describe a pattern of questionable decisions in large companies: strategy tax. Let's say you're part of a team in a big company, working on... I don't know, a video hosting site. You try to make the best product you can, listen to your users, and things are going pretty well. Until one fateful day, someone upstairs hears about your comment system, and they say "hang on, our social product has a comment system, you should integrate with it". Never mind that it sucks, you need to use it anyway, because that's what's best for the company.
One way of thinking about this is that if these were two companies rather than one, both free to splash about in the frothy ocean of free-market capitalism, what would happen? Well, unpopular-shitty-social-network company and popular-video-hosting-site company wouldn't really have any reason to make a deal. Maybe popular-video-hosting-site would even shop around and find out that actually-popular-social-network has a much better deal to offer in exchange for integrating with them. But this is the real world, and you're not two companies, you're one company. So you can't make that good deal. In fact, you can't even make the medium deal of just sticking with your regular comments. You have to accept the worst deal, even though it costs you. And that cost is the strategy tax, the price of having a strategy.
Although it's a catchy name, I'm not sure tax is actually the best word here. In economic terms, you could call it a kind of deadweight loss or efficiency loss. Alternatively, you could interpret strategy tax as meaning strategy tariff, an artificial penalty that a non-strategic option would have to exceed to be used over your strategic one. If you think of this idea in terms of trade policy, it bears a lot of resemblance to the debate around free trade: if your homegrown product sucks compared to the external competition, do you prop it up or just let it fail? In that sense, perhaps the best term is strategy mercantilism.
Although it's usually used to talk about big companies, this idea can still be useful for a smaller group or even an individual. If you institute some process, you are artificially distorting the decision market. If you say that all help requests must lead to an internal improvement, you remove the option for your employees to make that decision on its merits. I've occasionally tried to add extra structure to what I write, with things like experiments, retrospectives, or ongoing series, and my experience is always that it makes things much harder. There's something that would be the easiest or best thing to write at the time, but that's not the decision I go with, because I have my thumb on the decision-making scale.
Of course, strategy isn't always bad; often it's very good. But I'm making the argument that it is always inefficient. Any time you make a large-scale decision that requires overriding your small-scale decisions, you're throwing away the value that those small-scale decisions could generate. Strategic decisions do provide you with value, and that value might outweigh the inefficiency, but it's an inefficiency nonetheless.
Some external party who can figure out how to achieve the same ends with fewer strategic losses will be able to do what you're doing more efficiently, and since they're not restricted by any kind of mercantilism, they may just come and eat your lunch.