Okay, day three.  I had a vision in the shower.  That’s usually how it happens.  Change of plans, somewhat.  Today I shifted focus from gestural mapping cueing to programming.  The “ModBots” (the bots that are mainly percussive and tend to hang from the ceiling at LEMURplex) are easy enough to control using simple threshold detection while wearing the TwitchSet.  But the bots with notes, i.e. the XyloBot and the GtrBot, are much harder for me to control with any sort of fine resolution.  The GtrBot, for example, takes MIDI notes 36–81, a range of 46 distinct notes.  I had been trying to control note changes using one axis of the TriAx accelerometers on the TwitchSet.  An accelerometer is not a tilt switch, and it’s a bad substitute for one when the goal is 46 steps of resolution out of a 180˚ rotation of my shaky hand.
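To show why that mapping fights back, here’s a rough sketch of the math involved. This is not the actual TwitchSet code; it assumes one accelerometer axis reads −1 to +1 g as the hand rotates through 180˚, and all the names are made up for illustration:

```python
import math

# Hypothetical sketch of mapping one TriAx axis to the GtrBot's MIDI range.
# Assumes the axis reads -1.0..+1.0 g across a 180-degree hand rotation;
# none of these names come from the real TwitchSet software.

LOW_NOTE, HIGH_NOTE = 36, 81          # GtrBot's range: 46 distinct notes

def tilt_degrees(axis_g):
    """Convert one axis reading (in g) to a tilt angle, 0-180 degrees."""
    axis_g = max(-1.0, min(1.0, axis_g))
    return math.degrees(math.acos(axis_g))   # 0 deg at +1 g, 180 deg at -1 g

def note_from_tilt(angle):
    """Quantize a 0-180 degree tilt into one of the 46 MIDI notes."""
    step = 180.0 / (HIGH_NOTE - LOW_NOTE + 1)      # roughly 3.9 deg per note
    index = min(int(angle / step), HIGH_NOTE - LOW_NOTE)
    return LOW_NOTE + index
```

Each note ends up occupying under four degrees of rotation, which is well inside the jitter of a shaky hand holding a sensor that also responds to acceleration, not just tilt.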

Besides, I’m not a musician.  I’m not going to become one just because I have the pleasure of spending the next two weeks with musical robots.  I’ve decided to focus on the performative aspect of the residency, which is why I’m here in the first place.  Hell yes.  I think I’ve got something going here.  Below is a short clip documenting the partial results of today’s programming.  I didn’t write the song.  And I’m not going to tell you what song it is based on, or who wrote it.  That’s the surprise for the performance.  In progress: