
Project Brain

I thought I'd write up the work I did on Brain Light, aka Project Brain, in 2015. I've covered bits of it in earlier posts, but I never actually described the project as a whole. It was a collaboration between myself, artist Laura Jade and neuroscientist Peter Simpson-Young. The goal was to create a sculptural visualisation of brain activity using an EEG headset.

There were some interesting challenges to this project. Firstly, I wanted to be sure that the setup would be reliable. Too many art installations run on laptops with custom programs you have to fiddle with and set up in a particular way before you can get anything going. That's not just a matter of adding complexity for the artist; there was also a good deal of practical, selfish motivation: I didn't want to have to be there every time it was running. Basically, I wanted it to act like hardware, not software. For this reason, I very quickly settled on using a Raspberry Pi. They're replaceable (we bought three), relatively cheap, and powerful enough for the kind of processing we wanted to do.

Unfortunately, the platform for our EEG headset, the Emotiv Epoc, is not great. Despite the majority of their users being researchers and hobbyists, Emotiv have a tragically closed-platform mindset. You pay extra to install the SDK on different platforms, nothing is open source, and you just get a big binary blob and hope it works on your hardware. Since the SDK wasn't working on the Pi (or any ARM hardware) and their general platform strategy gave me the creeps, I decided to go it alone and just use the raw data. You have to pay extra for that too, but it seemed worthwhile given that it removed our dependency on their platform.

Free to use whatever software I wanted, I quickly settled on Python. Python is portable, easy to write, reasonably fast, and most importantly has a big following in the scientific community. I got the data in via a library called Emokit, and from there I really had free rein over any kind of signal processing SciPy had to offer. It took a bit longer doing it this way, because I had to learn what each of the components did rather than just using something off the shelf, but that understanding was actually helpful in the long run when I wanted to make changes.
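To give a flavour of what that SciPy processing looks like, here's a minimal sketch of turning a buffer of raw samples from one sensor into per-band power. The sample rate, window length and band edges here are illustrative assumptions, not necessarily the exact values the project used.

```python
import numpy as np
from scipy.signal import welch

SAMPLE_RATE = 128  # Hz; assumed, roughly what the Epoc reports
BANDS = {          # assumed band edges, in Hz
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
}

def band_powers(samples):
    """Estimate power in each EEG band from a 1D array of raw samples."""
    # Welch's method gives a smoother power spectral density than a raw FFT.
    freqs, psd = welch(samples, fs=SAMPLE_RATE, nperseg=SAMPLE_RATE * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = psd[mask].sum()
    return powers
```

Feed it a second or two of samples from one channel and you get back a small dictionary of band powers, which is about the level of summary the sculpture needed.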

With the data processing under control, it was really a matter of what to do with it. The main thing was to experiment, visualise and try different options so we could figure out what would look best on the sculpture. To give us that flexibility, I made the various processing components separate: one component handled reading the data off the device, another did the processing to turn that into a frequency spectrum, another summed those frequencies up into bands, and so on. These were all separate processes connected together via ZeroMQ sockets, and they could be started or stopped at any time.
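As a rough sketch of that component pattern, here's what one of the middle stages might look like, assuming ZeroMQ PUB/SUB sockets and JSON messages; the addresses and message format are placeholders rather than the real pipeline's.

```python
import zmq

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def run():
    ctx = zmq.Context()

    spectra = ctx.socket(zmq.SUB)            # upstream: per-channel spectra
    spectra.connect("tcp://localhost:5556")  # placeholder address
    spectra.setsockopt_string(zmq.SUBSCRIBE, "")

    bands_out = ctx.socket(zmq.PUB)          # downstream: summed bands
    bands_out.bind("tcp://*:5557")           # placeholder address

    while True:
        # Assumed message shape: {"freqs": [...], "power": [...]}
        msg = spectra.recv_json()
        sums = {
            name: sum(p for f, p in zip(msg["freqs"], msg["power"]) if lo <= f < hi)
            for name, (lo, hi) in BANDS.items()
        }
        bands_out.send_json(sums)

if __name__ == "__main__":
    run()
```

Because each stage only knows about its sockets, any of them can be restarted or swapped out without the others noticing, which is what made the experimentation easy.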

Finally, on top of that, I made a web interface to show us various kinds of substantially non-artistic debug output. This included the full frequency chart, as well as sensor contact quality info for positioning the headset. I also added an example brain ball visualisation that heavily informed the final version. Later on, I added a couple of other kinds of output: audio, and some nifty non-linear measurements like fractal dimension and entropy. Although these were interesting and indicative, the ultimate goal was to light up the perspex sculpture.
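As a flavour of the non-linear measurements, here's one common option, spectral entropy, computed from the same kind of Welch PSD as above. I'm not claiming this is the exact measure the project used; it's just a sketch of the general idea.

```python
import numpy as np
from scipy.signal import welch

def spectral_entropy(samples, fs=128):
    """Shannon entropy of the normalised power spectrum, in bits."""
    _, psd = welch(samples, fs=fs, nperseg=fs * 2)
    p = psd / psd.sum()   # normalise into a probability distribution
    p = p[p > 0]          # drop zero bins to avoid log(0)
    return -np.sum(p * np.log2(p))
```

A flat, noisy spectrum gives high entropy; a spectrum dominated by a single rhythm gives low entropy, which is why it's an interesting complement to plain band powers.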

We had originally planned to light the brain using bright LEDs and optic fibre, but it turned out to be very difficult to find RGB LEDs that were bright enough while still being individually addressable. We wanted something that we could run off DMX, which also made it more difficult. Something Arduino-based like NeoPixels would have been a possibility, but would also have involved more custom hardware and more ways for things to go wrong. In the end we just used a big projector mounted above the brain and projected the light directly onto it.

For that to work, we needed the Pi to put out some nice graphics over HDMI. Out of the various drawing options, I eventually went with raw OpenGL (via pyopengles). I didn't want a whole X server and all its associated bloat, especially given the EEG processing was already a substantial load on the Pi. I figured that if I could write the visualisation directly in shaders, it would run fast while still looking good. The shaders turned out to be pretty tricky: I initially wanted to do a more complex particle system, but settled on a neat cellular automata system instead.
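The actual rule lived in fragment shaders, so this NumPy sketch is only meant to illustrate the general idea: a continuous-valued cellular automaton where each cell is updated from its neighbourhood, which translates naturally to a shader that samples the previous frame's texture. The growth and threshold values here are made up for illustration.

```python
import numpy as np

def step(grid, growth=0.05, threshold=0.5):
    """One update of a toy continuous-valued cellular automaton."""
    # Average of the four orthogonal neighbours, with wrap-around edges.
    neighbours = (
        np.roll(grid, 1, axis=0) + np.roll(grid, -1, axis=0) +
        np.roll(grid, 1, axis=1) + np.roll(grid, -1, axis=1)
    ) / 4.0
    # Cells drift towards their neighbourhood average, then get pushed
    # towards 0 or 1 depending on which side of the threshold they land.
    mixed = 0.5 * grid + 0.5 * neighbours
    return np.clip(mixed + np.where(mixed > threshold, growth, -growth), 0.0, 1.0)

# Usage: start from noise and let the pattern settle.
grid = np.random.rand(64, 64)
for _ in range(100):
    grid = step(grid)
```

On the GPU the grid is just a texture, each pixel runs this kind of rule in parallel, and the brain data can feed in by modulating parameters like the growth rate, which is how you get something fast enough for the Pi while still reacting to the EEG.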

The final visualisation was kind of a mashup of the cellular automata and the original brain ball with a bunch of tweaks on top. Instead of being smooth gradients, the colours needed to be sharp contrasts so that they would show up well on the perspex. The projector was also mounted at an angle, so the perspective of the visualisation had to change from top-down to side-on. Finally, there was a lot of tweaking with Laura to find the best colour values, automata speeds, sizes, positions, and so on. This was actually quite a fun process because all the hard work was done, and I could just rely on her aesthetic judgement while I concentrated on fiddling with numbers.

All up, the project went quite well. It had an exhibition at Culture at Work, and appeared at events in the Powerhouse Museum and Museum of Contemporary Art. It also got some nice press from Vice and The Telegraph, and even ended up on ABC's Catalyst (as did Peter). My original goal of it being low-maintenance was also a success; I didn't touch it during the entire exhibition run, and I believe it's now on tour in Europe without needing any nightmare intercontinental tech support.

The code is up on GitHub for the curious. You can also check out the series of posts I wrote while I was working on it: Brains, Brain Ball, Automata, Brain Light, Brain Sounds and Non-linear.