A human that makes things, sometimes they don’t fall apart

Doctrine of Error

Eye painted with the Doctrine of Error from Brian Richardson on Vimeo.

These images were generated with a pretty simple program. It takes an input image and compares it against the current painted image. The program generates a new brush stroke, then the error between the input image and the painted image is recalculated: if the stroke reduces the error it is kept, otherwise it is rejected. You can view the final image of the eye above here.
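The real program is written in Racket (more on that below), but the loop is simple enough to sketch in Python. Everything specific here is a stand-in: the error metric (sum of squared differences) and the stroke model (random gray rectangles) are my assumptions, not the actual Glob strokes the program uses.

```python
import random

def error(a, b):
    # sum of squared differences between two grayscale images (lists of rows)
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb))

def random_stroke(w, h):
    # hypothetical stroke: a small filled rectangle with a random gray level
    x, y = random.randrange(w), random.randrange(h)
    return x, y, random.randrange(1, 5), random.randrange(1, 5), random.randrange(256)

def apply_stroke(img, stroke):
    # paint the stroke onto a copy of the canvas
    x, y, sw, sh, shade = stroke
    out = [row[:] for row in img]
    for j in range(y, min(y + sh, len(img))):
        for i in range(x, min(x + sw, len(img[0]))):
            out[j][i] = shade
    return out

def paint(target, iterations=10_000):
    h, w = len(target), len(target[0])
    canvas = [[0] * w for _ in range(h)]
    best = error(canvas, target)
    for _ in range(iterations):
        candidate = apply_stroke(canvas, random_stroke(w, h))
        e = error(candidate, target)
        if e < best:  # keep the stroke only if it reduces the error
            canvas, best = candidate, e
    return canvas
```

It's hill climbing, basically: strokes that don't help are thrown away, so the painting can only ever get closer to the input image.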

Tank Man – Painted with the Doctrine of Error from Brian Richardson on Vimeo.

I like how this one turned out; the source image of Tank Man was a bit noisy and I think it interacted with the algorithm in a cool way. Final image here.

Jellyfish painted with the Doctrine of Error from Brian Richardson on Vimeo.

I wrote this program in Racket. I’ve been meaning to learn a Lisp- or Scheme-based language for a long time, and I thought this program would be a fun way to start. The brush strokes use a technique called Globs, which connects two circles together with splines in a way that has first-derivative continuity with the circles. It’s a nice way to generate brush strokes. Final jellyfish image is here.
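The continuity idea can be sketched without reproducing the full Glob construction: pick a point on each circle, take the circle’s tangent there, and join the two points with a cubic Hermite spline whose end tangents match the circle tangents. The Hermite form and the tangent scaling below are my illustration, not the actual Glob math.

```python
import math

def circle_point(cx, cy, r, theta):
    # a point on the circle and the unit tangent there (perpendicular to the radius)
    p = (cx + r * math.cos(theta), cy + r * math.sin(theta))
    t = (-math.sin(theta), math.cos(theta))
    return p, t

def hermite(p0, m0, p1, m1, t):
    # cubic Hermite curve: p(0)=p0, p(1)=p1, p'(0)=m0, p'(1)=m1
    h00 = 2*t**3 - 3*t**2 + 1
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return tuple(h00*a + h10*b + h01*c + h11*d
                 for a, b, c, d in zip(p0, m0, p1, m1))

def glob_edge(c0, c1, theta0, theta1, samples=16):
    # one side of a glob: a spline leaving circle c0 and landing on circle c1
    # with tangents matched to each circle (first-derivative continuity)
    p0, t0 = circle_point(*c0, theta0)
    p1, t1 = circle_point(*c1, theta1)
    d = math.dist(p0, p1)  # scale tangents by the gap so the curve stays smooth
    m0 = (t0[0] * d, t0[1] * d)
    m1 = (t1[0] * d, t1[1] * d)
    return [hermite(p0, m0, p1, m1, i / samples) for i in range(samples + 1)]
```

Two such edges plus the two circle arcs give a closed, smooth stroke outline.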

brain painted with the Doctrine of Error from Brian Richardson on Vimeo.

This project was inspired by the Creative Applications of Deep Learning class and Ben Garney’s blog posts about writing a proof-of-concept video chat system. In both cases, the technique of defining, measuring, and reducing error is the core concept that makes it all work. Final brain here.

the-persistence-of-memory painted with the Doctrine of Error from Brian Richardson on Vimeo.

In the future, I’m planning to apply this to video. It’ll be straightforward to take the previous frame of a video and use it as the starting point for the next frame. I think the effect could be cool. I might also port a simple version of this to a more popular language and give a DorkbotPDX workshop on it. Final clocks here.


iOSBNIZ is a port of IBNIZ by viznut to iPhone and iPad. It lets me program little audiovisual hacks on my iPhone with minimal typing. IBNIZ features a VM with opcodes that are one character long, ideal for not hitting too many buttons. It also has a FORTH-like stack, which is a fun puzzle to play with. The VM calls the program for each pixel and pushes time, x, and y onto the stack. Your program uses these values to generate the visuals (and audio). You can also write a full program and ignore this per-pixel loop.
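A toy model of that per-pixel evaluation: seed a stack with (t, x, y), run one-character opcodes over it, and take the top of the stack as the pixel value. This is only a sketch of the idea; the real IBNIZ VM uses 16.16 fixed-point values and a much larger opcode set, and the exact semantics of the opcodes below are my approximation.

```python
def eval_ibniz(program, t, x, y):
    # toy one-character-opcode stack machine in the spirit of IBNIZ
    stack = [t, x, y]  # the VM seeds the stack per pixel, as described above
    for op in program:
        if op == '+':
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif op == '-':
            b, a = stack.pop(), stack.pop(); stack.append(a - b)
        elif op == '*':
            b, a = stack.pop(), stack.pop(); stack.append(a * b)
        elif op == 'd':  # dup the top of the stack
            stack.append(stack[-1])
        elif op == 'x':  # exchange the top two values
            stack[-1], stack[-2] = stack[-2], stack[-1]
        elif op == 'p':  # pop (discard) the top value
            stack.pop()
    return stack[-1]  # top of stack becomes the pixel value
```

So the one-character program `*` multiplies x and y per pixel, which already gives a pattern over the screen; that economy is what makes it so nice to type on a phone.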

iOSBNIZ is available on the Apple App Store. Download and have fun!

Source code is available on GitHub

iOSBNIZ Features

  • iOSBNIZ features a keyboard tailored to the opcode structure of IBNIZ. Characters are grouped together by function.
  • Holding a button down on the keyboard displays help text to remind you what the opcode is.
  • It keeps your last program for fast restarting of hacks.
  • Swipe right to display your program without the code; swipe left to bring the editor back.
  • Swiping left will display help.
  • Swiping left again will allow you to save and load programs.
  • Shake the phone to switch between swipe mode and “U” opcode mode. In U opcode mode, your touch location is sent to the U opcode.

This Is How We Disappear

I’ve been meaning to document my work on this project for about a year now. TBA 2014 reminded me it’s time to actually do it! So here it is:

For the past year and a half I’ve been working with a dance company called BOBBEVY. I’ve been creating graphics that go along with the dance performance called “This is how we disappear”. Here’s a review at Portland Monthly.

This is how we disappear - projection edit from Brian Richardson on Vimeo.

More behind the scenes information after the break!

Church of Robotron Sermon

On March 25th, 2013, the Church of Robotron gave a sermon at DorkbotPDX 0x0A. It focused on the mobile Church of Robotron installation we did at Toorcamp 2012. James gave live sermons interspersed with presentation snippets about the installation. We projected our faces onto old TVs, an homage to Dr. O’Blivion and the Wizard of Oz. I mutated the Party House project to support webcams and applying shaders to the input stream. I was able to control the video source and shaders in real time during the sermon. We also had slides. I think it was a nice mix of a live human being, pseudo-live humans being projected, and standard slides. I hope to do more sermons in the future!

Use MAME’s Debugger to Reverse Engineer and Extend Old Games

For the Church of Robotron’s installation at Toorcamp 2012, we needed to be able to trigger physical events when game events happened in Robotron 2084. A quick summary for context:

  • We had an Altar that contained a Linux box that ran MAME and Robotron 2084
  • We had joysticks that acted as HID devices
  • Players would kneel down and play the game. As events happened in the game (player death, human death, lasers firing), we would trigger physical events.

We chose to use MAME’s debugger to detect game events and notify other pieces of software when they happened. This is a quick tutorial for others (and a reminder to ourselves) if you’re interested in doing similar things. We’re going to find out how to detect player death!
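As a taste of where that goes: MAME’s debugger can set a watchpoint on a memory address and run an action whenever the game writes to it. The address below is a placeholder, not the real Robotron 2084 lives counter; finding the real one is what the reverse-engineering part of the tutorial is about.

```
wpset ba00,1,w
wpset ba00,1,w,wpdata==0,{printf "player died\n";g}
```

The first form just breaks into the debugger on any write to the hypothetical address; the second only fires when the value written drops to zero, prints a message, and resumes the game, which is the shape you want for triggering external events without pausing play.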

Party House

Party House is a projection mapped dollhouse that was quickly thrown together for the DorkbotPDX Open Mic Surgery event. It was written in a few days using the Cinder library. The source code is available here: PARTY HOUSE REPO.

It was super fun to perform. It wasn’t a complex or hard thing to do, but sitting up on stage pumping my fist and hitting space bar over and over again to the beat was fun. I’d definitely like to do more of this kind of “work”.

Read below for my post mortem notes.

Church of Robotron

The Church of Robotron is a forward-looking group attempting to save the last human family. We did a large pop-up church installation at Toorcamp 2012. It had the following features:

  • During gameplay, a Jacob’s ladder and sparker were running
  • A fog machine randomly triggered
  • Lasers fired in real life when enforcer shots were fired in game
  • A rotating flapper near the player’s hands spun when humans were killed by Robotrons
  • An animated GIF of your face at your moment of death in game, displayed on a leaderboard in the other room
  • A bright LED flash on death, which let us get a decent picture from the webcam and added to the player’s disorientation
  • A kneeler base which detected players and controlled lights
  • A readerboard which displayed the top player and witty statements
  • Randomly shuffling sermon videos
  • A lit totem pole
  • A reading room which contained stickers, a chick tract, and a zine

I hacked MAME to expose debugging events (breakpoints, memory watchpoints) to clients via sockets. I also did some reverse engineering of the ROM to find game events (deaths, high score, etc.). To do this, I built upon the work of the great Sean Riddle. I also used OpenCV to capture the players’ faces when they died, which were posted to the high-scores page. We had about 15 people working on various pieces of the project. It was great to see it all come together, and it was great to see people’s reactions to it!
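The client side of that socket feed is simple in principle: connect, read event lines, and fire hardware when the right event arrives. I haven’t documented the actual wire format here, so this sketch assumes a made-up format of one space-separated "EVENT args…" line per event and a made-up port number.

```python
import socket

def parse_event(line):
    # hypothetical wire format: "EVENT_NAME arg1 arg2 ..." per line
    parts = line.strip().split()
    return (parts[0], parts[1:]) if parts else (None, [])

def listen(host='localhost', port=5802):  # port is made up for illustration
    with socket.create_connection((host, port)) as sock:
        buf = b''
        while True:
            data = sock.recv(4096)
            if not data:
                break
            buf += data
            while b'\n' in buf:
                line, buf = buf.split(b'\n', 1)
                name, args = parse_event(line.decode())
                if name == 'PLAYER_DEATH':
                    # trigger the LED flash / webcam capture here
                    print('player death!', args)
```

Keeping the game-event detection inside MAME and the physical effects in separate socket clients meant each piece (lasers, fog, leaderboard) could be developed independently.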

Source code is available here. More tech details will be written up at some point; I’m planning a quick post about MAME hacking soon.

BOBBEVY, This Is How We Disappear

“This is how we Disappear” is a 25-minute dance performance created by the BOBBEVY dance company. I used a lot of random technologies (Cinder, Microsoft Kinect, libfreenect, OSC, and Control) to create realtime, dancer-reactive visuals to go along with the performance. We performed it quite a few times, and my favorite was at the Hollywood Theatre as part of Experimental Half Hour. It was great to see the graphics projected so huge!

I’ve learned quite a bit about what’s required to do visuals for a performance from this project. It’s definitely influenced everything I’ve done afterwards. I’m hoping to do more of this type of work in the future.

Some post-mortem notes are after the break:

The Missing Link

I created the initial firmware for the Missing Link, an OSC receiver that outputs MIDI. The purpose is to hook up OSC clients such as TouchOSC or Control to older synths. The source is available here.
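The core of that job is protocol translation: parse an OSC message (a null-padded address string, a type-tag string like ",ii", then big-endian arguments) and emit the corresponding MIDI bytes. This is a sketch of the idea in Python, not the actual firmware logic; the "/note" address and the channel-1 note-on mapping are invented for illustration.

```python
import struct

def parse_osc(data):
    # OSC strings are null-terminated and padded to a multiple of 4 bytes
    addr_end = data.index(b'\x00')
    addr = data[:addr_end].decode()
    off = (addr_end + 4) & ~3          # skip the padding after the address
    tag_end = data.index(b'\x00', off)
    tags = data[off:tag_end].decode()  # e.g. ",ii"
    off = (tag_end + 4) & ~3
    args = []
    for t in tags[1:]:                 # skip the leading ','
        if t == 'i':
            args.append(struct.unpack('>i', data[off:off + 4])[0])
            off += 4
        elif t == 'f':
            args.append(struct.unpack('>f', data[off:off + 4])[0])
            off += 4
    return addr, args

def osc_to_midi(addr, args):
    # hypothetical mapping: /note <num> <vel> -> note-on, channel 1
    if addr == '/note':
        note, vel = int(args[0]), int(args[1])
        return bytes([0x90, note & 0x7F, vel & 0x7F])
    return b''
```

On the real hardware the OSC side arrives over WiFi/UDP and the three MIDI bytes go straight out the DIN jack, but the translation step looks roughly like this.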

Read below for my post mortem notes.

Processing Class

DorkbotPDX Processing Workshop

I gave a class on using Processing to create simple graphics. I also quickly “covered” using the OpenCV addon to detect faces. Notes, slides, etc. are available here.

It was the first time I’ve taught a class in quite a while. I got a fairly positive response (from my friends, so they have to be nice to me!). I think I did a good job setting up mini-goals that took 20-30 minutes to accomplish. The idea was to give everyone little victories quickly so they’d be motivated to stick around and conquer more.