30-year predictions

I recently watched Alan Kay's How to Invent the Future (and part 2), and one thing I thought was particularly interesting was his idea of trying to invent the future from the future. That is to say, rather than extrapolating from now to the next 10 years, extrapolate to the next 30 years and then work backwards. That way you're finding an incremental path to a big vision, as opposed to ending up with a small and incrementalist vision.

In that spirit, I thought I'd come up with some 30-year predictions:

  1. Memetics is redeveloped as a practical discipline, becoming the primary driver behind development of entertainment (games, social media) and social influence (politics, PR). Dopamine-farm distraction technology becomes worryingly effective. Deep questions begin to emerge about who is actually making decisions when we're all so easily manipulated on such a large scale. Anti-memetics and non-democratic forms of government become serious areas of consideration.

  2. Software development splits into distinct disciplines, both vertically (architects vs engineers vs builders vs technicians) in its current general-purpose focus and horizontally away from general-purpose towards different special-purpose forms for different domains. This isn't so much DSLs as distinct paradigms like Excel formulas, LabVIEW or Max/MSP. Most people program computers if you consider a wider definition that includes this "soft" programming.

  3. Elimination of all passive media, ie all documents/images/movies etc become executables containing both the data and the code to interpret them. This is partly for DRM and partly to enable new interactive forms. Sandboxing and trust models replace Turing-restriction for software safety. (You can already see this happening on web and mobile). Most media becomes dynamic or interactive in some way, even if the main experience is still non-interactive, eg movies with an embedded chat box.

  4. Human interaction becomes almost entirely machine-mediated, leading to people accustomed to complete control over their social interactions. Huge industry of asymmetric transactional relationships (livestreamers, social media personalities, personal services etc) with defined parameters. This leads to a healthy but small market of countercultural "talk to someone you might not like without being able to immediately get rid of them" services and quaint local-community-based interaction. Most people will be happy enough only interacting with others in controlled circumstances.

  5. Semi-autonomous companies dominate the market as machine learning algorithms demonstrably trounce human judgement in investment, management, and executive-level strategy. People are still involved to do work that is not currently automatable, but that field shrinks from the top as well as the bottom. There is a commensurate increase in value for creativity, which continues to elude AI. Most people are not employed, either spending their time on human-centric creative pursuits like art or sunk deep in a well of machine-optimised distraction.

The blank slate

I've been thinking a bit about this month's Conventional Wisdom. What's the point of it? I mean, maybe if I've forgotten about something for long enough it should stay dead.

But the problem is, while I'm sure some things do just drop completely out of your memory, my experience is that they're not really forgotten; little bits of them hang around. The ghosts of discarded-but-not-quite-abandoned projects come back to haunt you when you're out for a walk, trying to sleep, or in the shower thinking about something else.

In that sense, then, better late than never is really an extension of statelessness; rather than having all your projects in some indeterminate state, the goal is to drive them to completion or destruction. Either you're doing them or not doing them, but maybe kinda doing them is an extra load on your working memory that you don't need.

To that end, I've started doing something substantially out of my comfort zone: closing windows. By way of explaining how much of a change this is: I usually have a hundred or so tabs and windows open, spread across 10 virtual desktops. I do this not because I'm using all those windows, but because it's easier if I want to pick up where I left off on a project.

But that, too, is yet more state. Every Chrome window full of tabs I intend to read later is an extra burden I have to carry around. Every Sublime Text window full of notes I should probably file away is a drain, not just on my computer's memory but my own.

Worst of all, this has a very obvious effect on my focus. When I start working on something, I don't just have to find the windows that are relevant to that project; I also have to ignore all the ones that aren't. At the critical and vulnerable time between tasks, it's so easy to see some stray email or article I've been saving up and get distracted.

Anyway, no more. I've realised that rather than starting with lots of junk and subtracting my way to what I want, I should instead start with nothing and add to reach what I want. In other words, no old browser windows. In fact, no old windows of any kind. I want to sit at my computer and see an empty screen waiting for me to fill it with something useful. A blank slate.

Puzzle circuits

A Puzzle Circuits NOT, AND, and XOR gate

As part of my residency at Culture at Work I'm working on alternate ways of thinking about computing that are more tactile, explorable and concrete. I want to try a bunch of different ideas, turn them into prototypes, and see what can best help me pull back the silicon curtain.

My first one is called Puzzle Circuits. This is designed to be a version of digital logic that's easy to play with and understand. Each logic gate, switch, or component is represented as a piece of a jigsaw puzzle. Inputs and outputs are on the edges of the piece and the jigsaw tabs make the connection, preventing you from mixing up outputs and inputs.

All the traces are illuminated with LEDs when active, so you can always see all the state in the system at once. Actual digital logic has some extra complexity that can be swept under the rug: how to get power to all the gates, pull-down resistors, etc. The idea would be that you could just concentrate on the logic-gate level of abstraction without worrying about the electrical engineering.
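As a rough sketch of that level of abstraction (in hypothetical Python, nothing to do with any eventual implementation), a gate piece is just a boolean function plus its input connections, with every trace's state readable at any time:

```python
# Hypothetical sketch of gate pieces: each piece is a boolean function
# plus its input connections, and any trace's state can be read on demand.

class Piece:
    def __init__(self, name, fn, inputs):
        self.name = name      # e.g. "AND"
        self.fn = fn          # boolean function of the input values
        self.inputs = inputs  # upstream Pieces, or constant True/False

    def output(self):
        # Pull the live value of each input trace, then apply the gate.
        vals = [i.output() if isinstance(i, Piece) else i for i in self.inputs]
        return self.fn(*vals)

def NOT(a):    return Piece("NOT", lambda x: not x, [a])
def AND(a, b): return Piece("AND", lambda x, y: x and y, [a, b])
def XOR(a, b): return Piece("XOR", lambda x, y: x != y, [a, b])

# The "LED on every trace" view: the whole circuit's state is inspectable.
circuit = AND(XOR(True, False), NOT(False))
print(circuit.output())  # True: (True XOR False) AND (NOT False)
```

There's no power, no resistors and no timing here; just the jigsaw-level idea of outputs plugging into inputs.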

This would take two forms: a physical representation made of actual puzzle pieces cut from PCBs with logic gates soldered on to them, and a software representation that you can use in a web browser. The browser version would be easy to prototype with, so I'll start with that, but the goal is to have both.

Other than allowing people without the hardware to play with the ideas, this would also serve an important purpose: making meta-pieces. You could assemble a bunch of virtual pieces on your computer, then write that logic into a programmable puzzle piece that you can use like any other.

Far from being a novelty, I think the ability to make pieces that are made of pieces, that themselves are made of pieces, is a fundamental part of what makes computation what it is. There is a kind of fractal self-similarity that lets you express information at any level of abstraction.
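To make that self-similarity concrete, here's a hypothetical sketch (plain boolean functions standing in for pieces): a half-adder is two gates, a full adder is two half-adders plus an OR, and each level presents the same interface as a single piece:

```python
# Hypothetical sketch of meta-pieces: every level of composition has the
# same interface (inputs in, outputs out) as a single gate piece.

def xor(a, b): return a != b
def and_(a, b): return a and b
def or_(a, b): return a or b

def half_adder(a, b):
    # A meta-piece made of two gate pieces.
    return xor(a, b), and_(a, b)          # (sum, carry)

def full_adder(a, b, carry_in):
    # A meta-meta-piece made of two half-adders and an OR.
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, or_(c1, c2)                # (sum, carry_out)

print(full_adder(True, True, True))  # (True, True): 1 + 1 + 1 = binary 11
```

Chain full adders along their carries and you have a multi-bit adder, which is another piece again; that's the fractal at work.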

That said, the whole goal is that it can scale with your level of interest. You can get started by just plugging some buttons into some logic gates and a speaker and have fun bleeping and blooping, but when you're ready to do more complex things the idea can increase in complexity to match your ambitions.

A Puzzle Circuits half-adder


Programming is sleep-like

I was trying to think of a good way to explain to a friend recently why programming and meetings are so incompatible. The canonical reference for this is Paul Graham's Maker's Schedule, Manager's Schedule. Basically, programming needs big blocks of uninterrupted time, but managers tend to operate on small blocks. For managers, meetings are just another small block, but for programmers their whole big block is interrupted. That seems about right, but what it doesn't explain is why.

I'd like to propose another metaphor: programming is sleep-like. Sleep is a fairly odd phenomenon compared to most of our behaviour. First, we have to engage in a certain degree of ritual to even get it going. We lie down somewhere dark and quiet, remove all external stimulation, empty our minds, and wait. Even when it comes, sleep is easily interrupted, and many short blocks of sleep don't work as well as fewer long blocks, no matter what the Ubermenschen will tell you.

Sleep comes in stages. The first is an essentially transitional light sleep; the second is a medium sleep that makes up the majority of the duration; the third is the deepest, so-called slow-wave sleep (SWS). Finally there's the much-hyped REM sleep, which is actually closest to light sleep in terms of brain activity. It used to be thought that REM was the most important stage, and then SWS, but the present consensus is that they work together.

Interestingly, programming also seems to involve stages. In my experience, the early stages involve acclimatising yourself to the code, getting all the different parts in your head, and generally setting up. With that done, you can then start thinking deeply about the problem. Once the solution occurs to you, you write the code for it, which looks like the hardest part but is actually the easiest.

Now, this is just a metaphor; there's no similarity at a neurological level between sleep and programming. That said, there are functional similarities. Being interrupted in the light stages is not so bad, but being interrupted in the deep stages means basically starting over. A certain amount of time is required to get through all the stages, and it has to be in one continuous block.

Most significantly, I feel like the main quality both sleep and programming share is that they thrive in a vacuum. To sleep effectively you have to get rid of everything that isn't sleep, and so too with programming. The main predictor of both good code and good sleep is long stretches of uninterrupted time.

Of course, programming isn't the only thing with this quality. Really, I think that any focused application of your mind would have similar characteristics, be that programming, writing, studying or something else. I suspect it's some unifying property of the way our brains work best; associations build on themselves, so you need to keep the spurious ones quiet and give the relevant ones time to build up.

40 Covers of Skrillex

In the spirit of better late than never, I'm releasing a project I did back in 2012, but haven't published until now: 40 Covers of Skrillex.

This originally started as an idea vaguely inspired by Kutiman's Thru-you. At the time, Skrillex covers were really popular on YouTube, especially Scary Monsters and Nice Sprites. After hearing a few I started thinking, what if you could remix all those covers together to make a whole new version of the song?

I spent the better part of a day just listening to and downloading as many covers as I could find, which ended up being about 50. I took a bunch of notes on which instruments were in the cover, sections that I thought sounded particularly good, etc. I started vaguely thinking about how they might fit together, but didn't really have anything concrete in mind.

After that, I started loading stuff into Reaper. Originally I was thinking I'd do the audio editing there and then figure out the video afterwards, but actually Reaper's video editing was powerful enough for my modest needs. I put in the original track as a reference and just started playing around to see what worked.

The following two days or so just consisted of repeatedly adding new covers, figuring out where they'd fit, listening to the sound of different parts together, getting ideas for what might work well next and then starting the process again. A lot of the best ideas came from just trying something silly to see how it would sound; a particular favourite was the floppy disk + guitar solo.

I'm not entirely sure why I didn't release the video at the time. I definitely wanted to get to 50 covers, but I'm not really sure how I was going to get them or where they would have gone. I was also a little unhappy with the audio; I had to lay a pretty fat compressor over everything because there was such a varied amount of sound, and I thought it sounded kinda muddy in the loud sections.

I've had this project in the back of my head for the last five years. Every now and again I would think "oh, yeah, I should finally get around to fixing whatever's wrong with that and finish it". Funny thing is, when I listened to it, it seemed totally fine. What I ended up uploading was completely unchanged from the version I wasn't ready to release five years ago.