
Anonymity and democracy

[Image: troll masquerade mask]

It's well known that many online communities start to get bad as they grow. There are a lot of theories on why this is; some of my favourites are Gabe's Greater Internet Fuckwad Theory and evaporative cooling. But I think there's also an underexamined root cause, one which happens to be the main assumption underlying democracy: the assumption that people have value.

I should be clear: I think people have value. It's a good assumption for democracy, but because democracy is the social system we're most familiar with, we tend to carry its assumptions over into the design of other systems. Large online communities struggle under the weight of non-valuable people: trolls, spammers and other undesirables. And you can never really get rid of them. Anyone you ban can make a new account, and even if you could get rid of them permanently, there would just be some new troll basically identical to the last.

Indeed, maybe the point is that there's no difference; a new user is a new identity, and one of the wonderful things about the internet is not being limited to one fixed identity that's often assigned by others. In the future I think we'll reach a point where it's common to have many identities for different communities and different purposes. We'll swap identities as quickly as a change of clothes, adopting new ones when the old ones don't suit, for playing a particular role or just for fun. As technology improves there will even be autonomous identities that can, themselves, spawn more autonomous identities.

How can we possibly create a social system to handle that? The Facebooks and Googles of the internet are obsessed with keeping our online identity tied to our real-world identity because they know that's the only way to make the people-have-value assumption hold true online. But what if we just abandon it? I think we could build richer systems on the assumption that identities are not valuable. Instead, they have to earn value.

Many communities already do this to some extent. On Hacker News, certain features unlock as you gain more karma. On Reddit, repeated successful posts make you less likely to be caught by the spam filter. But you could go further and make a community where posting isn't allowed at all until you demonstrate value in some other way, like voting on other new posts. Google's PageRank is another interesting example: it assumes new websites have basically no value, and they gain value by being referenced by other sites. You could build a similar social system that passes value along through endorsements.
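
Here's a rough sketch of what that endorsement-passing idea might look like. Everything in it is made up for illustration (the damping constant, the iteration count, the little endorsement graph), but it shows the shape of the thing: a fresh identity that nobody has endorsed ends up with almost no value.

```python
# Sketch of a PageRank-style reputation system: identities start with
# essentially no value and only gain it when already-valuable identities
# endorse them. All names and numbers here are illustrative assumptions.

DAMPING = 0.85       # fraction of an identity's value passed on through its endorsements
BASELINE = 0.15      # small residual value so the iteration converges
ITERATIONS = 30

def endorsement_value(endorsements):
    """endorsements maps each identity to the identities it endorses."""
    identities = set(endorsements) | {e for targets in endorsements.values() for e in targets}
    value = {i: 1.0 / len(identities) for i in identities}  # start everyone equal (and low)

    for _ in range(ITERATIONS):
        new_value = {i: BASELINE / len(identities) for i in identities}
        for endorser, targets in endorsements.items():
            if not targets:
                continue
            share = DAMPING * value[endorser] / len(targets)
            for target in targets:
                new_value[target] += share  # value flows along endorsement edges
        value = new_value
    return value

if __name__ == "__main__":
    # "newcomer" hasn't been endorsed by anyone, so it keeps roughly the
    # baseline value, while identities endorsed by established ones accumulate more.
    graph = {
        "alice": ["bob", "carol"],
        "bob": ["carol"],
        "carol": ["alice"],
        "newcomer": [],
    }
    for identity, v in sorted(endorsement_value(graph).items(), key=lambda kv: -kv[1]):
        print(f"{identity}: {v:.3f}")
```

It glosses over plenty (dangling identities, people gaming the graph with sockpuppet rings), but the important property is there: value isn't granted to an identity just for existing, it has to flow in from somewhere.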

There are also other solutions, usually suggested to fight spam, where new identities or actions (like sending an email) have a monetary cost, thus proving a certain degree of real-world value. While that's definitely better than being tied to a particular real-world identity, I think money is not a great analogue for value in many communities. A determined evildoer can afford to just trade money for mischief. A better way would be to allow users to earn value within the community itself. Why bother to make an identity to do harm if you first have to do at least as much good?
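
To make the contrast with a monetary cost concrete, here's a toy version where the price of an action is denominated in earned karma rather than dollars. The action names and costs are invented for the example; the point is just that the only way to afford mischief is to have contributed first.

```python
# Sketch of "earn value before you can spend it": a fresh identity starts at
# zero, gains karma for contributions the community approves of, and riskier
# actions cost karma up front. Actions and prices are made up for illustration.

from dataclasses import dataclass, field

ACTION_COSTS = {
    "vote_on_new_posts": 0,    # always allowed: this is how value is earned
    "post": 5,
    "message_stranger": 10,
}

@dataclass
class Identity:
    name: str
    karma: int = 0
    history: list = field(default_factory=list)

    def earn(self, amount, reason):
        """Credit karma for a contribution the community has endorsed."""
        self.karma += amount
        self.history.append((reason, +amount))

    def attempt(self, action):
        """Allow an action only if the identity has earned enough value to cover it."""
        cost = ACTION_COSTS[action]
        if self.karma < cost:
            return False
        self.karma -= cost
        self.history.append((action, -cost))
        return True

if __name__ == "__main__":
    troll = Identity("fresh-account")
    print(troll.attempt("post"))     # False: no earned value yet
    troll.earn(5, "helpful vote streak")
    print(troll.attempt("post"))     # True: the good had to come first
```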

Even if this is already happening in some ways, it's important to recognise the trade-off in question: you can't have identities be free and inherently valuable at the same time. The true power of virtual identity has yet to really come into its own, and I don't think it will until we are willing to sacrifice that inherent value.