Ok, this was Feb 17, but it was so rad I wanted to document it. This was three drummers:
Louis Cole
Justin Brown
Roni Kaspi
Basically just doing a jam session. It was super fun to watch these folks trade beats back and forth. (I didn’t take any pics, will get better about this!)
Kontravoid at Lodge Room
Buzz Kull opened and he was fun to watch. Angry screaming dude over some beats with a lot of glitchy visuals projected behind him. Boring description, but it worked for me at the time!
Kontravoid was better than I expected. I was always 50/50 on him, but he put on a good show and I really enjoyed it. So I am now 70/30 on Kontravoid. Highlight was the backing video of him riding a jetski with the mask on (heavily processed of course). Very fun!
Failed to see Javier Santiago at Stowaway
We tried to see this, but the street the Stowaway is on was shut down because the cops were protecting some fancy dinner event that benefited the IDF. Bullshit! After having cops tell me to lie to other cops to get access to the road, we gave up.
Jermaine Paul Quintet
Luckily, Javier was playing with this quintet, so we didn’t miss him fully. This was a show at a space in Sherman Oaks. I forgot who drummed, but I remember really enjoying his work.
Ben Frost at Lodge Room
Totally floored by this show. I didn’t know what to expect. I’m more familiar with his ambientish work. For this show, he was working with a guitar player. It seemed like the guitars could trigger percussion samples and Ben Frost was creating the sounds beneath the guitar/percussion combo. It was great! It was loud! It was awesome!
Mononeon at Lodge Room
Mononeon was super fun. His new songs are good, and he had a lot of folks up on stage. At one point, there were like 4-6 backup singers.
Minaret Show @ Left Method
Opened with Luke Titus on drums and Elijah Fox on keys. Luke Titus is definitely in my top 5 list of drummers. I love his style and he can play really fast. I sometimes feel like I’m listening to drum and bass, but then I open my eyes and it’s a HUMAN BEING hitting the drums.
It closed with Henry Solomon, chiquitamagic, and Billy Voltage doing a set with synths and saxophone. It worked out pretty well! There were some really cool moments during their set.
Been really enjoying the shows Minaret Records puts on. They are often at venues I’d otherwise never make it to. It feels like a scavenger hunt through a new city to me.
Tsutomu Nakai @ Zinc
Part of a series called ‘Guitar Masters’, which isn’t normally my thing, but I had a lot of fun at this show. Zinc Bar in NYC is a cool space. The drummer was really good. I enjoyed his solos quite a bit!
Hiromi’s Sonicwonder @ Blue Note
We made it to the famous Blue Note in NYC to see Hiromi Sonicwonder do their thing. Hiromi was great on keys. I really enjoyed the drummer, I liked that he gave space to be quiet during his solo.
Webb All Stars @ Baked Potato
I went to this to see Danny Carey hit the drums. I was not disappointed. It was great to see him drum in such a small space. You could see him listening to the music and see how focused he was on what he was doing. It was great. The Baked Potato is a really cool venue; looking forward to going back!
These images were generated with a pretty simple program. It takes an input image and compares it against the current painted image. Each iteration, the program generates a new brush stroke and recalculates the error between the input image and the painted image; if the stroke reduces the error it is kept, otherwise it is rejected. Final image here
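The accept/reject loop is simple enough to sketch. Here’s a minimal Python version of the idea (the actual program was written in Racket, and this sketch paints plain rectangles rather than Glob strokes), operating on a flat list of grayscale pixels:

```python
import random

def error(target, canvas):
    # Sum of squared per-pixel differences between target and canvas.
    return sum((t - c) ** 2 for t, c in zip(target, canvas))

def random_stroke(width, height):
    # Toy "stroke": a filled rectangle with a random gray value.
    # The real program generated Glob-shaped strokes instead.
    x = random.randrange(width)
    y = random.randrange(height)
    w = random.randrange(1, width - x + 1)
    h = random.randrange(1, height - y + 1)
    return (x, y, w, h, random.random())

def apply_stroke(canvas, width, stroke):
    x, y, w, h, value = stroke
    out = list(canvas)
    for row in range(y, y + h):
        for col in range(x, x + w):
            out[row * width + col] = value
    return out

def paint(target, width, height, iterations=2000, seed=0):
    random.seed(seed)
    canvas = [0.0] * len(target)
    best = error(target, canvas)
    for _ in range(iterations):
        candidate = apply_stroke(canvas, width, random_stroke(width, height))
        e = error(target, candidate)
        if e < best:  # keep the stroke only if it reduces the error
            canvas, best = candidate, e
    return canvas, best
```

Because rejected strokes cost nothing, the algorithm never gets worse; it just climbs toward the target image one lucky stroke at a time.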
I like how this one turned out; the source image of Tank Man was a bit noisy and I think it interacted with the algorithm in a cool way. Final image here.
I wrote this program in Racket. I’ve been meaning to learn a LISP or Scheme based language for a long time. I thought this program would be a fun way to start. The brush strokes use a technique called
Globs, which basically connects two circles together with splines in a way that has first-derivative continuity with the circles. It’s a nice way to generate brush strokes. Final image here.
It was inspired by the Creative Applications of Deep Learning class and Ben Garney’s blog posts about writing a proof of concept video chat system. In both cases, the technique of defining, measuring, and reducing error is the core concept that makes it all work. Final brain here.
In the future, I’m planning to apply this to video. It’ll be straightforward to take the previous frame of a video and use it as the starting point for the next frame. I think the effect could be cool. I might also port a simple version of this to a more popular language and give a DorkbotPDX workshop on it. Final clocks here.
iOSBNIZ is a port of IBNIZ by viznut to iPhone and iPad. This allows me to program little audiovisual hacks on my iPhone with minimal typing. IBNIZ features a VM with opcodes that are one character long, ideal for not hitting too many buttons. It also has a FORTH-like stack, which is a fun puzzle to play with. The VM calls the program for each pixel and pushes time, x, and y onto the stack. Your program uses these values to generate the visuals (and audio). You can also write a full program and ignore this per-pixel loop.
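The per-pixel model is easy to sketch. Here’s a toy Python version of the idea (the opcode set below is made up for illustration and is not IBNIZ’s real instruction set):

```python
def run_pixel(program, t, x, y):
    """Evaluate a one-character-opcode program for a single pixel.
    Toy opcodes: + add, * multiply, ^ xor, d dup."""
    stack = [t, y, x]  # the VM seeds the stack with time and coordinates
    for op in program:
        if op == '+':
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == '*':
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == '^':
            b, a = stack.pop(), stack.pop()
            stack.append(a ^ b)
        elif op == 'd':
            stack.append(stack[-1])  # duplicate top of stack
        else:
            raise ValueError("unknown opcode: " + op)
    return stack[-1]  # top of stack becomes the pixel value

def render_frame(program, t, width, height):
    # The VM runs the tiny program once per pixel, every frame.
    return [[run_pixel(program, t, x, y) for x in range(width)]
            for y in range(height)]
```

Even a one-character program like `^` produces the classic XOR pattern, which is why such tiny programs can still be fun on a phone keyboard.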
iOSBNIZ is available on the Apple App Store. Download and have fun!
I’ve been meaning to document my work on this project for about a year now. TBA 2014 reminded me it is time to actually do it! So here it is:
For the past year and a half I’ve been working with a dance company called BOBBEVY. I’ve been creating graphics that go along with the dance performance called “This is how we disappear”. Here’s a review at Portland Monthly.
Jesse Meija was doing the music and got me involved with this project; I’m very grateful!
Version 1
Effects for the first set of performances:
A forest of trees created from drawings by David Stein.
Particle effects that mimicked the dancers’ movements.
Particle effects that just fly across the screen.
In order to accomplish this, I wrote a piece of software that would do the animation and handle tracking the dancers. There were two versions; the first version was used to perform a few times, notably at Dance+ in 2012 and as part of Experimental Half Hour XXXVII. It consisted of the following pieces:
Freenect, used to interface to the Microsoft Kinect.
Control, used to control the software from an iPad.
This version worked OK. Before I started using Control, I had been triggering all of the sequences with keyboard commands. It worked fine, but I had to have a cheatsheet that told me which keys did what. Also, each command just mutated the state of the program, so if you triggered things in a different order you’d end up in different states. This made some rehearsals hard, because it was difficult to return the graphics to a previous state. With Control, however, the software became much easier to use. BOBBEVY performed in Milwaukee without me and was able to use the software just fine! For Dance+ the Kinect refused to work in the studio, I think because the temperature in the room was so high, so I ended up “drawing” the dancers with a multi-touch interface in Control.
For the particle effects that followed the dancers, I ended up using blob tracking and distinguishing blobs based on their distance from the Kinect. I liked the stateless design because the dancers would move in and out of view of the Kinect, and I felt that tracking their identities properly would have been a nightmare. It also created some surprising benefits. The swarms move between the dancers when their relationship to the Kinect changes, and it created some really nice animations. This piece also has a lot of tension between the dancers, and the particles ended up expressing that tension when the dancers were about the same distance from the Kinect.
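The stateless assignment amounts to very little code. A Python sketch of the idea (the real version lived inside the C++/Cinder app, and the blob format here is made up):

```python
def assign_swarms(blobs):
    """Stateless per-frame assignment: sort this frame's blobs by depth
    and hand out swarm slots by rank. No identity is tracked between
    frames, so a swarm naturally 'jumps' to the other dancer when their
    order relative to the Kinect flips. Each blob is (x, y, depth_mm)."""
    ordered = sorted(blobs, key=lambda blob: blob[2])  # nearest first
    return {slot: blob for slot, blob in enumerate(ordered)}
```

Swarm 0 always follows whichever blob is nearest, so the “hand-off” between dancers falls out of the sort for free.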
Version 2
Additional effects for the second set of performances:
Static/simple projection mapped screens (similar to my Party House project).
Realtime projection mapped patterns on the dancers’ bodies.
For the second version of the software, I used the following new pieces:
QTimeline, a timeline that allows one to control tweens of variables.
QuNeo, as much as I liked Control, using a touchscreen while not looking at it (I had to look at the dancers for my cues!) is not ideal. A physical controller lets you rest your finger on the button/slider you need to push without triggering it. (The cool kids pronounce it keeen-wah.)
For TBA: 3, yes 3, projectors! Two of them were used to cover the wide background, and one was used to project onto the dancers themselves.
The timeline solved many problems for me. It took what I used to hard-code in the application (fade times, animation speed, etc.) and moved it into a data format. The editing GUI was nice to have as well. A new version of Cinder made multiple displays much easier to work with: I didn’t need to mirror my desktop screen anymore, which meant I could show debug and other helpful info on my own screen. The QuNeo also let me directly control ramp parameters, which meant I didn’t need to rely on predetermined fades as much. That let me be more engaged with the visuals, which was really fun. I think the trick will be finding the right balance between direct control and triggering presets. It is probably the same balance electronic musicians search for.
The newest effect for the second run was the projection mapped dancers. In order to accomplish this, I was going to have to find the dancers with the Kinect and then project onto them as close as possible. I used the vvvv patch from here as a starting point to learn how to calibrate my projector with the Kinect. In the end, I wrote my own calibration code because it fit the setup workflow a bit better.
The projection mapped dancers worked pretty well. I was really excited to see them turn into an indistinguishable mass one moment and then back into dancers the next. I think this is what projection mapping should do: transform objects and confuse you, then bring you back to reality. I hope to do more of this in the future!
On March 25th, 2013 the Church of Robotron gave a sermon at DorkbotPDX 0x0A. It focused on the mobile Church of Robotron installation we did at Toorcamp 2012. James gave live sermons interspersed with presentation snippets about the installation. We projected our faces onto old TVs, an homage to Dr. O’Blivion and the Wizard of Oz. I mutated the Party House project to support webcams and applying shaders to the input stream. I was able to control the video source and shaders in real time during the sermon. We also had slides. I think it was a nice mix of a live human being, pseudo-live humans being projected, and standard slides. I hope to do more sermons in the future!
Use MAME’s debugger to reverse engineer and extend old games
For the Church of Robotron’s installation at Toorcamp 2012, we needed to be able to trigger physical events when game events happened in Robotron 2084. A quick summary for context:
We had an Altar that contained a Linux box that ran MAME and Robotron 2084
We had joysticks that acted as HID devices
Players would kneel down and play the game. As events happened in the game (player death, human death, lasers firing), we would trigger physical events.
We chose to use MAME’s debugger to detect game events and notify other pieces of software when they happened. This is a quick tutorial for others (and a reminder to ourselves) if you’re interested in doing similar things. We’re going to find out how to detect player death!
Start mame in windowed mode with the debugger active:
mame -window -debug robotron
You’ll see two windows appear, one is the emulation, the other is the debugger window. You can type “help” at the console and get good documentation on the debugger capabilities. You can also click on the down arrow to access a menu that allows you to run the emulation and later pause it at different points.
Memory dump
The first thing to do is find the memory location of the number of lives the player currently has. One way to do this is to take advantage of other people’s work. ;) Sean Riddle has a great site with a lot of this information already available! The other way is the manual way. Let’s do that!
Start the game by hitting F5
Start playing a game, by hitting 5 to drop a coin, and 1 for one player game.
Pause the game by hitting F8
In the debugger console, type “help memory”
Ah, nice, there’s a dump program memory command. Type: “dump lives3.txt,0,0xFFFF”
Press F5, then click on the emulation screen
Hit F2 to access the Operator Screens of Robotron.
Hit F2 until you see “Game Adjustment”
We’re going to change the default number of lives to 5 and see if we can find the difference in the dump.
Press “D” to go down to “Turns Per Player”
Press “I” to change the number of lives to “5”
Press F2
Reselect the debugger screen
Press F8 to pause
Press the Down Arrow, Select Reset, then Hard
Start playing a game, by hitting 5 to drop a coin, and 1 for one player game.
Let the game get to the main screen, then pause the game by hitting F8
Type “dump lives5.txt,0,0xFFFF”
Now we have dumps of the game with 5 lives and 3 lives as the default. Let’s use a diff tool to see what changed in memory! On OS X, you can use FileMerge or standard old “diff”. Scanning through the differences, we don’t see anything that correlates to 5 lives and 3 lives. But we do see two spots that seem to correlate to 4 and 2 lives:
2722,2727c2722,2727
< AA60: 18 07 FF 06 5F 04 1D AA 58 39 CD 02 00 00 00 00 ...._...X9......
---
> AA60: 18 07 FF 06 43 03 E9 AA 58 39 CD 04 00 00 00 00 ....C...X9......
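If you’d rather diff the dumps programmatically, a few lines of Python do the trick (a sketch; it assumes MAME’s dump layout of an address, 16 hex bytes, and an ASCII column per line):

```python
def parse_dump_line(line):
    """Parse one line of a MAME 'dump' file, e.g.
    'AA60: 18 07 FF 06 ... X9......' -> (0xAA60, [0x18, 0x07, ...])."""
    addr_part, rest = line.split(':', 1)
    base = int(addr_part, 16)
    data = [int(token, 16) for token in rest.split()[:16]]
    return base, data

def diff_dumps(lines_a, lines_b):
    """Return (address, byte_in_a, byte_in_b) for every differing byte."""
    diffs = []
    for la, lb in zip(lines_a, lines_b):
        base, da = parse_dump_line(la)
        _, db = parse_dump_line(lb)
        for i, (a, b) in enumerate(zip(da, db)):
            if a != b:
                diffs.append((base + i, a, b))
    return diffs
```

Running it over the two lines above flags 0xAA6B going from 02 to 04, exactly the 2-lives/4-lives candidate we watchpoint next.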
Type “help watchpoints”. A watchpoint will cause program execution to stop whenever a memory address is written to or read from.
Find out more info on setting a watchpoint by typing “help wpset”
Let’s set our watchpoints
Type “wpset 0xAA6B,1,w” This will set a watchpoint that is triggered whenever our first memory address guess is written to
Type “wpset 0xBDEC,1,w”
Run the game
Assuming you didn’t die, you’ll notice the first watchpoint has been hit and it doesn’t have anything to do with player death. Let’s clear that one.
Type “wpclear 1”
Start the game again, and play hard until the Mutant Savior dies.
You’ll notice the second watchpoint is triggered!
We’ve found the memory location for player lives! We can also take note of the instruction pointer address. It’s at 27AC; let’s set a breakpoint there.
Setting breakpoints
Type “bpset 0x27AC”
Type “wpclear”
Continue playing. You’ll notice the breakpoint is hit for every player death except the last one. That’s a bummer! Let’s walk up the stack and figure out a better spot. On some CPU types you can do just that, but with the 6809 that’s powering Robotron you can’t. So we’ll turn on the trace file, run the game until we hit our breakpoint, and then close everything. Then we can step back up from the bottom of the file and see if there are interesting addresses to set breakpoints on.
Trace file
Type “trace trace.txt”
Start a game and die
In the debugger type “traceflush”
Then type “trace off”
Open the trace.txt file in an editor. Go all the way to the bottom. You’ll see a nice trace with repeated blocks called out. I use these blocks as starting points for new breakpoints. If you scan up the file, you’ll see repeated addresses of DC56. Skip above those; they’re not interesting. We want to find the first branch of code that doesn’t repeat. Just scanning, here are the addresses that seem interesting to me:
0xD676, 0xD8BC, 0xD1ED, 0x30FE
Let’s set some breakpoints:
Type “bpset 0xD676”
Type “bpset 0xD8BC”
Type “bpset 0xD1ED”
Type “bpset 0x30FE”
Start running!
You’ll notice D1ED gets triggered a lot, so disable that one with “bpclear (breakpoint number)”. The same thing happens with D8BC and D676. But 0x30FE seems to work!
Summary
I hope this helps people get used to the MAME debugger. After you have this information, you may want to do something with it. We chose to broadcast game events over UDP. You can see how by looking at our GitHub repo here.
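The UDP side can be sketched in a few lines of Python (the port number and event names here are made up; the real ones live in the repo):

```python
import json
import socket

EVENT_PORT = 9999  # hypothetical; see the repo for the real config

def broadcast_event(name, data=None, host="255.255.255.255", port=EVENT_PORT):
    """Fire-and-forget a JSON game event (e.g. 'playerDeath') over UDP."""
    msg = json.dumps({"event": name, "data": data or {}}).encode("utf-8")
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(msg, (host, port))
    sock.close()

def listen_for_events(port=EVENT_PORT, handler=print):
    """Blocking loop a prop controller might run: one event per datagram."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    while True:
        datagram, _addr = sock.recvfrom(1024)
        handler(json.loads(datagram.decode("utf-8")))
```

UDP fits this setup well: if a fog-machine controller misses one “playerDeath” datagram, nothing needs to recover, and there are no connections to babysit between the MAME box and the props.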
Party House is a projection mapped dollhouse that was quickly thrown together for the
DorkbotPDX Open Mic Surgery event. It was written in a few days using the Cinder library. The source code is available here: PARTY HOUSE REPO.
It was super fun to perform. It wasn’t a complex or hard thing to do, but sitting up on stage pumping my fist and hitting space bar over and over again to the beat was fun. I’d definitely like to do more of this kind of “work”.
I had originally followed this MadMapper tutorial on mapping a building with MadMapper and creating a video to go along with it in After Effects. It was a great introduction. I ended up running into problems in my case because the dollhouse had lots of overhangs and different planes to work with. Maybe if you’re projecting on a huge building, the building dwarfs the projector by enough that you can treat it as one plane. But in this case, I had to create a bunch of planes to make everything line up nicely.
This is also the first time I’ve used After Effects at all. It was nice to just make some videos and then layer them on the fly during my set. I definitely want to spend more time learning After Effects for future projects.
I ended up writing my own software to do the mapping because I’m a hacker and just wanted to. The code isn’t that great, but I do think the EditorViewport and ControlPoint classes are nice enough to re-use in future projects. I also updated the code to use the newest branch of Cinder, so it now supports multiple windows. It was also fun to start using C++11 features.
The Church of Robotron is a future-looking group that is attempting to save the last human family. We did a large pop-up church installation at Toorcamp 2012. It had the following features:
During gameplay, the Jacob’s Ladder and Sparker were running
Fog machine randomly triggered
Lasers fired in real life when Enforcer shots were fired in game
Rotating flapper near the player’s hands spun when humans were killed by Robotrons
Animated GIF of your face at time of death, displayed on a leaderboard in the other room
Bright LED flash on death; this allowed us to get a decent picture from the webcam and added to the player’s disorientation
Kneeler base which detected players and controlled lights.
Readerboard which displayed top player and witty statements
Randomly shuffling sermon videos
Lit totem pole
Reading room which contained stickers, a Chick tract, and a zine.
I hacked MAME to expose debugging events (breakpoints, memory watchpoints) to clients via sockets. I also did some reverse engineering of the ROM to find game events (deaths, high score, etc.). To do this, I built upon the work of the great Sean Riddle. I also used OpenCV to capture players’ faces when they died, which were posted to the high scores page. We had about 15 people working on various pieces of the project. It was great to see it all come together and to see people’s reactions to it!
Source code is available here. More tech details will be written at some point. I’m planning to write a quick post about MAME hacking soon.
“This is how we Disappear” is a 25 minute dance performance created by the BOBBEVY dance company. I used a lot of random technologies: Cinder, Microsoft Kinect, libfreenect, OSC, and Control to create realtime dancer-reactive visuals to go along with the performance. We performed it quite a few times, and my favorite was at the Hollywood Theatre as part of Experimental Half Hour. It was great to see the graphics projected so huge!
I’ve learned quite a bit about what’s required to do visuals for a performance from this project. It’s definitely influenced everything I’ve done afterwards. I’m hoping to do more of this type of work in the future.
Some post-mortem notes:
Using the Kinect in a performance space is a pretty harrowing experience. If there’s a lot of reflection or IR noise in the room, the Kinect can just die. I think there are some tweaks you can do in libfreenect to keep getting data, but it’s likely to be too noisy to work. I ended up creating a fallback that allowed me to “draw” the dancers with my fingers via a Control sketch. But that method isn’t ideal.
I initially triggered everything with keyboard keys, but it became hard to remember what did what after a while (even for a powah user like myself). Having an iPad user interface was much nicer and allowed the BOBBEVY crew to run the show themselves when they performed in Milwaukee. In theory, this would also let me trigger the cues far from the computer, which would give more options for where the computer and Kinect go. In practice, the UDP packet loss when doing ad-hoc networking seemed too high to rely on this. It’d be nice to have a TCP transport as an option in the future.
I also added Bonjour discovery to everything; this made it much, much easier to get everything configured before a show. I recommend this to people doing multi-machine work.