Yes, and

There's a fun improv theatre game called "yes, and", designed to teach people the right way to influence a scene without ruining it. It's tempting, when you start, to try to control a scene; your partner says "let's go to space", but you had a great idea for a medical scene, so you say "no way, space is too cold, let's stay here in the hospital" and now you've ruined their idea. Worse still, maybe they respond with "but space is great" and now you're having a boring argument on stage.

Instead, you want to build on what is there already. They say "let's go to space" and you respond "yes, and quickly, doctor, because those astronauts will die without our help!" Thanks to both of your ideas, you now have an interesting scene about space doctors that both of you can contribute to. You can't control the scene; instead, you guide it in the direction you want it to go: always forward, never backwards; always adding, never taking away.

You do this to make a good scene, but it's worth wondering why that is a quality of a good scene. Do our preferences have something more universal to say about adding vs removing? I wrote before about how hard it is to remove an association, and about how exhaustive knowledge is required to disprove anything. I think these effects combine to make negatives unpalatable, whether or not they are useful. Not just for people, but as a general rule. Nature abhors a negative.

Another interesting example of this is distributed systems. It's very easy to write a gossip protocol where peers just spam facts at each other and merge in the facts they receive. In that sense, you can say that no peer has incorrect information, they just don't have all the correct information yet. Eventually, as the system converges, correct information will spread to every peer, and peace and correctness will reign throughout your distributed system.
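A minimal sketch of that additive gossip idea, using a hypothetical `Peer` class (the names here are mine, not from any particular system): each peer holds a grow-only set of facts, and merging is just set union. Union is commutative, associative, and idempotent, so peers can spam state at each other in any order, any number of times, and still converge.

```python
class Peer:
    """A gossip peer whose state is a grow-only set of facts."""

    def __init__(self, facts=None):
        self.facts = set(facts or [])

    def gossip_to(self, other):
        """Send everything we know; the receiver merges it in."""
        other.merge(self.facts)

    def merge(self, incoming):
        # Union only ever adds facts, never removes them, so a merge
        # can never make a peer "more wrong", only more complete.
        self.facts |= set(incoming)


a = Peer({"9:00 sunny"})
b = Peer({"9:30 cloudy"})
a.gossip_to(b)
b.gossip_to(a)
# Both peers now hold the full set of facts.
assert a.facts == b.facts == {"9:00 sunny", "9:30 cloudy"}
```

Note that nothing here coordinates who talks to whom or how often; convergence relies only on the merge being monotone, which is exactly what "no peer has incorrect information, just incomplete information" buys you.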

But that's only true when your information is additive, when it contributes more facts rather than taking some away. The weather at 9am was sunny. Yes, and the weather at 9:30 was cloudy. But how do you delete information? How do you say "wait, no, the weather at 9am was actually rainy"? Your system is no longer a matter of correct vs more correct; it's correct vs incorrect, and that's much harder to deal with. Whether it's un-committing something from Git, actually deleting a document from CouchDB, or taking information off the internet, distributed deletion isn't like distributed addition. It's hard.
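The usual workaround, in CouchDB and elsewhere, is telling: you don't actually delete, you *add* a record saying the fact is dead, a "tombstone". Here's a sketch of that idea in the style of a two-phase set (class and method names are mine, for illustration): facts live in an added set and a removed set, and the effective view is the difference.

```python
class TombstonePeer:
    """A peer that 'deletes' additively, via tombstones."""

    def __init__(self):
        self.added = set()
        self.removed = set()  # tombstones: additions that mean "gone"

    def add(self, fact):
        self.added.add(fact)

    def delete(self, fact):
        # "Deleting" is itself an addition, so it gossips and merges
        # exactly like any other fact.
        self.removed.add(fact)

    def merge(self, other):
        self.added |= other.added
        self.removed |= other.removed

    def facts(self):
        return self.added - self.removed


a = TombstonePeer()
b = TombstonePeer()
a.add("9am: sunny")
b.merge(a)
a.delete("9am: sunny")  # wait, no, it was actually rainy
a.add("9am: rainy")
b.merge(a)
assert b.facts() == {"9am: rainy"}
```

The catch is the point of this whole post: the tombstones are never truly gone. They have to be kept (or garbage-collected very carefully) forever, and in this simple scheme a deleted fact can never be re-added. Deletion didn't become easy; it just got disguised as addition.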

And the most gossipy distributed system of all is, of course, people. We copy ideas like crazy from each other, constantly broadcasting and absorbing everything we learn. But what about something we un-learn? How often do you go around telling people "wow! I just realised I'm wrong about something"? Probably never, or at least very rarely. Un-learning isn't fun, negative information isn't interesting, and telling someone they're wrong is the absolute opposite of "yes, and".

Which brings us back to why we don't like negatives, and why "no, but" isn't a fun improv game. I believe this follows from the fundamentally additive way our brains store information. We can't unassociate, so we have to associate a negative. We can't easily forget, so we just learn to feel bad about remembering. Our mechanism for forgetting is thus essentially painful. The same mechanism that teaches us to recoil from a hot iron teaches us to recoil from a bad memory. It hurts to be wrong.

This is why atheism, skepticism, and other negative movements struggle to gain acceptance. You can't tell a religious person there's no God and have that mean anything, because the closest additive analogue is "keep your existing beliefs and also feel bad about them". And that's why the modern fight against post-truth is so hard: it's always easier to add falsehoods than to fight them, always easier to "yes, and Obama is a secret Kenyan poisoning us with autism vaccines" than it is to "no, nothing about what you have said is remotely true in any way".

What's the solution? Well, if you want to change minds, perhaps the best way is to add conflicting information. Instead of saying "there's no God", you say "here are some compelling ideas that will eventually come into conflict with a belief in God". The rationality movement is, in a sense, an attempt to do this. This is a long game that relies on setting up an eventual paradox that you hope will be resolved in your favour.

But it's worth wondering if you always need to change minds. If your issue isn't with religion, but with current religious practice, it would be far easier to replace "no, but the universe is interesting anyway" with "yes, and God is in all things, and all our religions are imperfect attempts to understand that true God, who is the universe itself". Perhaps ideologically atheism is preferable to pantheism, but from a utilitarian perspective it seems pretty clear that a world full of pantheists would be pretty similar and probably easier to achieve.

So for long-term convincing, I believe adding conflicting information is the best way to achieve an eventual subtraction of ideas. But for important things on a short timeline, perhaps it's best to swallow your pride and find a way to say "yes, and" to bad beliefs, and thereby gain some measure of influence over them.