This project was developed for a Physical Computing class as an introduction to both coding and digital fabrication. It explores musical hardware and interaction design through the creation of a custom DIY MIDI controller.

Constructed from laser-cut acrylic, the controller features 8 buttons, 2 sliders, and 2 rotary knobs, along with a Time of Flight (ToF) proximity sensor for gesture-based control. The device is built around an Arduino microcontroller programmed in C++, which reads each control and handles MIDI routing and modulation in real time.
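
While the project's firmware isn't reproduced here, a minimal Arduino sketch along these lines illustrates the core loop: read a control, scale it to MIDI's 7-bit value range, and send a Control Change message. The pin assignment, CC numbers, and library choices (the MIDIUSB library and Adafruit's VL53L0X driver) are illustrative assumptions, not the actual build.

```cpp
// Minimal sketch: map one slider and the ToF sensor to MIDI CC messages.
// Assumes a USB-capable board (e.g. Leonardo/Micro), the MIDIUSB library,
// and Adafruit's VL53L0X library; pins and CC numbers are placeholders.
#include <MIDIUSB.h>
#include <Adafruit_VL53L0X.h>

const int SLIDER_PIN = A0;   // hypothetical slider wiring
const byte CC_SLIDER = 1;    // mod wheel, as an example
const byte CC_TOF    = 74;   // filter cutoff, as an example

Adafruit_VL53L0X lox;

void sendCC(byte cc, byte value) {
  midiEventPacket_t event = {0x0B, 0xB0 | 0, cc, value};  // CC on channel 1
  MidiUSB.sendMIDI(event);
  MidiUSB.flush();
}

void setup() {
  lox.begin();
}

void loop() {
  // Slider: 10-bit ADC reading scaled to the 7-bit MIDI range.
  byte slider = map(analogRead(SLIDER_PIN), 0, 1023, 0, 127);
  sendCC(CC_SLIDER, slider);

  // ToF sensor: clamp a hand hovering 50-400 mm above the sensor to 0-127.
  VL53L0X_RangingMeasurementData_t measure;
  lox.rangingTest(&measure, false);
  if (measure.RangeStatus != 4) {  // 4 means "out of range"
    int mm = constrain(measure.RangeMilliMeter, 50, 400);
    sendCC(CC_TOF, map(mm, 50, 400, 0, 127));
  }
  delay(10);  // crude rate limiting
}
```

A fuller version would also scan the buttons as note on/off events and send a CC only when its value actually changes, to avoid flooding the MIDI stream.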

Complementing the hardware is an audio-reactive visualizer built using p5.js. This component translates incoming MIDI signals and sound data into dynamic, generative visuals, creating a synchronized audiovisual experience.
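
As a rough sketch of how such a visualizer can work (assuming p5.sound is loaded and a browser that exposes navigator.requestMIDIAccess), the snippet below drives a single shape from two live signals: microphone amplitude sets its size, and the most recent MIDI Control Change value sets its color. The mappings are illustrative, not the project's actual sketch.

```javascript
// Minimal p5.js sketch: a circle whose size follows the mic level and whose
// color follows the last MIDI CC value, received via the Web MIDI API.
let mic, amp;
let lastCC = 0;  // most recent controller value, 0-127

function setup() {
  createCanvas(600, 600);
  mic = new p5.AudioIn();
  mic.start();
  amp = new p5.Amplitude();
  amp.setInput(mic);

  // Listen on every available MIDI input and keep the latest CC value.
  navigator.requestMIDIAccess().then((access) => {
    access.inputs.forEach((input) => {
      input.onmidimessage = (e) => {
        const [status, , value] = e.data;
        if ((status & 0xf0) === 0xb0) lastCC = value;  // Control Change
      };
    });
  });
}

function draw() {
  background(10);
  const level = amp.getLevel();              // 0.0-1.0 amplitude
  const size = map(level, 0, 1, 50, width);  // sound drives size
  colorMode(HSB, 127);
  fill(lastCC, 100, 120);                    // MIDI drives hue
  noStroke();
  ellipse(width / 2, height / 2, size, size);
}
```

One practical note: browsers require a user gesture before audio can start, so a real sketch would call userStartAudio() from something like a mousePressed() handler before reading the mic.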

As a first experiment in embedded electronics and creative coding, the project demonstrates how accessible tools can be used to prototype expressive, multi-sensory instruments. It reflects an early exploration into how physical computing, sound, and visual media can converge to expand the possibilities of performance and interaction.
