Attritional interfaces
I've never really liked RSS readers. I've used them on and off, but in the end it always goes the same way: I end up following too much stuff, the "unread" counts pile up into the hundreds or thousands, and I eventually declare RSS bankruptcy and abandon the whole thing until the next go-around. In recent years, though, social news sites like Reddit, Twitter and Hacker News have mostly filled the RSS-shaped hole in my life, despite missing a lot of the content I used to go to RSS readers for. Why is this?
My contention is that social news sites are fundamentally attritional, by which I mean they slowly lose data by design. While this would be suicide for office software or a traditional database-backed business application, it actually works very well for social news. Old posts on Reddit fade away under the weight of new ones, and the only way to keep them alive is with constant attention in the form of upvotes and reposts. It's quite common to think of something you saw a few days ago and be unable to find it or remember what it was called. While that might be frustrating, it's actually Reddit working exactly as intended.
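The fade-under-attention dynamic can be captured with a simple exponential decay. This is a hypothetical sketch of the idea, not Reddit's actual ranking algorithm: a post's visibility halves with every period it goes without attention, and only fresh upvotes can keep it afloat.

```python
import math
import time

def attrition_score(upvotes, created_at, now=None, half_life_hours=24):
    """Score that decays exponentially unless refreshed by attention.

    Hypothetical illustration, not Reddit's real formula: with each
    half-life that passes, a post's visibility halves, so anything
    that stops attracting upvotes quietly sinks out of sight.
    """
    now = now if now is not None else time.time()
    age_hours = (now - created_at) / 3600
    return upvotes * math.exp(-math.log(2) * age_hours / half_life_hours)
```

With a 24-hour half-life, a day-old post needs twice the upvotes of a fresh one just to rank evenly, which is exactly the "constant attention" requirement described above.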
The trick is that most software is designed to complement us. Where we are forgetful, computers remember. Where we are haphazard, computers are systematic. Where we are fuzzy, computers are precise. This makes them amazing tools, because we can do what we are good at and leave computers to do what we aren't. However, some systems have to be designed to mirror us. When we make a user interface, it has to work like we work, or it won't make sense to us. Email is designed to complement our memory so that we don't just lose emails. Reddit is designed to mirror our memory so that it can present us with constant novelty.
That said, I should stress that these two things aren't really opposites. In fact, it would be very difficult to design fundamentally attritional software because eventually you run into the reality that the system is a computer, not a human. Usually, you'll have a reliable system underneath with an attritional interface on top. Reddit, for example, is built on a database and never actually loses information. You wouldn't want it to anyway, because people do link to Reddit threads from elsewhere. The only reason things go missing is because the interface is set up that way.
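This split between a reliable store and an attritional surface can be sketched directly. The class below is a hypothetical illustration (the names and cutoff are my own, not any real site's API): every item is kept forever and stays addressable by id, so external links never break, but the front page only surfaces items whose attention-decayed score remains above a threshold.

```python
class AttritionalFeed:
    """Reliable store underneath, attritional interface on top (a sketch).

    Nothing is ever deleted: items remain addressable by id, mirroring
    how Reddit threads stay linkable from elsewhere. Forgetting happens
    purely at the interface, in front_page().
    """

    def __init__(self, half_life=86400.0, cutoff=1.0):
        self.items = {}          # permanent: id -> [created_at, score]
        self.half_life = half_life
        self.cutoff = cutoff

    def post(self, item_id, now):
        self.items[item_id] = [now, 1.0]   # one implicit upvote

    def upvote(self, item_id):
        self.items[item_id][1] += 1.0      # attention keeps items alive

    def lookup(self, item_id):
        return self.items.get(item_id)     # direct links always work

    def front_page(self, now):
        def decayed(entry):
            created, score = entry
            return score * 0.5 ** ((now - created) / self.half_life)
        # Only items still "remembered" appear; the store stays intact.
        return sorted(
            (i for i, e in self.items.items() if decayed(e) >= self.cutoff),
            key=lambda i: decayed(self.items[i]),
            reverse=True,
        )
```

An item that falls off `front_page` is still returned by `lookup`: the database remembers, only the interface forgets.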
RSS readers are an example of software crying out for an attritional interface. I don't care about some blog post I've been ignoring for weeks, but it stubbornly persists until I am the one to take the initiative and say "yes, I definitely don't want to read this". Just let me forget about it! Though RSS readers are an easy target, there are many other examples. I previously wrote about browser tabs that accumulate without end. Mobile notification systems could also benefit from a dose of attrition; do I really need constant reminding that some app updated until I specifically dismiss it?
So, if you're working on an interface, I would encourage you to consider: am I trying to complement or mirror the user here? And, specifically, consider whether your system should remember things forever just because it can, or whether it might be better to forget.