Percussion Sequencer
2024
For the final project of my interactive media design class in fall 2024, I chose to make a percussion sequencer with physical interfaces for sequencing, synthesis, and effects parameters. Going into the project, I wanted the sequencing interface to be minimal but intuitive, and the synthesis parameters to be limited yet versatile. For the effects interface, I wanted to encourage users to feel out the parameters rather than set them deliberately, making for a more expressive tool. My broader goals included striking the right balance between predetermined values and user control, and designing the sequencer so that it could integrate into a larger system.
While assembling the device, I decided that users would scroll through the 16 steps of the sequence and toggle each step on or off, so that only a knob (in this case a rotary encoder) and a button were needed to edit the sequence, which would be displayed on LED strips. I also narrowed the synthesis parameters down to those most essential for my device: tone (the ratio of a triangle wave to white noise), pitch, and decay length.

Since all of the aforementioned components drew inspiration from conventional electronic music controllers, I chose to create a more novel system for post-processing effects: I attached an Inertial Measurement Unit (IMU) and a Force Sensing Resistor (FSR) to a small foam ball, with the former reading the rotation of the ball and the latter reading the pressure applied to it. The intent was a more visceral interface, in contrast to the precise controls typically afforded by digital and electronic musical tools. All sensor data is processed by an Arduino and sent to Max, where clocking is initiated and sound is synthesized.