Final project for COMP_SCI 497 (RTVF 376-0-20): Digital Musical Instrument Design | Northwestern University
Headhunter is a head-tracking MIDI controller. It allows users to control a MIDI-compatible instrument with their head.
Headhunter uses the sensors in a smartphone for head tracking and translates the tracking data into corresponding MIDI messages using Max. Users can easily customize Headhunter by remapping head movements to different MIDI messages.
My inspiration for Headhunter comes from Keith Jarrett, one of my favorite jazz pianists.
Besides his constant and notorious moaning, Mr. Jarrett also loves to make these dramatic head movements when playing the piano.
People find these movements distracting because they have nothing to do with the music. This made me think: what if these seemingly redundant movements could control some aspect of the sound? What if they became part of the musical performance? The audience would no longer find the movements distracting but engaging, because they would now receive auditory feedback when they see them. It would also give the keyboard player more degrees of freedom, allowing things that were not possible before: bending notes while playing two-hand chords, or changing parameters without having to touch any knobs or sliders.
Starting from this initial idea, my plan was to build a proof of concept first and, if it worked, commit to the idea and bring the instrument to life.
First, I did some research on head tracking and found an open-source program called opentrack. It accepts a variety of head-tracking inputs, including smartphones, so I (as well as other broke musicians who might be interested in this instrument) don’t have to pay for a dedicated head-tracking headset.
Next, I tried to figure out how to convert the yaw-pitch-roll data from opentrack into something Max could recognize. After some research, I decided to use opentrack’s built-in virtual joystick output protocol, which encodes the head-tracking information as a virtual joystick. Max can then read the virtual joystick as input.
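As a hedged sketch of how that joystick data might enter the patch (this is not the actual Headhunter code): Max’s hi object polls a HID device such as the virtual joystick and reports element/value pairs, which a small js object can split into one stream per rotation axis. The element IDs below are placeholders; the real IDs depend on how the vjoy driver enumerates its axes.

```javascript
// Sketch of a Max [js] object fed by a [hi] object polling the virtual
// joystick. [hi] reports two-item lists: an element ID and its value.
// The axis IDs here are hypothetical placeholders.

inlets = 1;
outlets = 3;

var YAW_ID = 0, PITCH_ID = 1, ROLL_ID = 2; // placeholder element IDs

// Route each incoming (element, value) pair to the outlet for its axis.
function list(element, value) {
    if (element === YAW_ID)        outlet(0, value);
    else if (element === PITCH_ID) outlet(1, value);
    else if (element === ROLL_ID)  outlet(2, value);
}
```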
The final step was to figure out how to properly map the three virtual joystick inputs to MIDI messages. These inputs are integers from 0 to 65535, while the pitch-bend and continuous controller messages Max sends (via bendout and ctlout) take values from 0 to 127. A simple linear mapping wouldn’t do: first, it is really hard to move one’s head far enough to reach both 0 and 65535; second, if the tracking is too sensitive, it is hard to stay in tune and accidental changes can occur. These kinds of problems require a lot of trial-and-error parameter adjustment. After many experiments, I wrote several dedicated JavaScript objects in Max that handle these rather complicated mapping rule sets.
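To make the idea concrete, here is a minimal sketch of one such mapping as a Max js object. It is not the actual Headhunter code, and the constants are hypothetical, but it shows the two ingredients described above: a dead zone around the neutral head position so small motions don’t change the value, and a clamped sub-range of the axis, rather than the full 0 to 65535, mapped onto 0 to 127.

```javascript
// Hypothetical mapping sketch for a Max [js] object; the constants are
// illustrative, not the values used in the actual patch.

inlets = 1;
outlets = 1;

var CENTER = 32768;    // raw axis value at the neutral head position
var DEAD_ZONE = 3000;  // raw units around center that produce no change
var RANGE = 15000;     // raw units beyond the dead zone mapped to full scale

// Map a raw 0..65535 axis value to a MIDI value in 0..127, centered at 64,
// suitable for [bendout] or a bipolar [ctlout] parameter.
function msg_int(raw) {
    var offset = raw - CENTER;
    if (Math.abs(offset) < DEAD_ZONE) {
        outlet(0, 64); // inside the dead zone: hold the centered value
        return;
    }
    // Remove the dead zone, scale to -1..1, and clamp at the extremes.
    var sign = offset > 0 ? 1 : -1;
    var scaled = sign * (Math.abs(offset) - DEAD_ZONE) / RANGE;
    scaled = Math.max(-1, Math.min(1, scaled));
    // -1..1 -> 0..127 (center 64), rounded to an integer.
    outlet(0, Math.round((scaled + 1) * 63.5));
}
```

The dead zone keeps notes in tune while the head is roughly still, and the reduced range means the player never has to crane their neck to reach the physical extremes of the axis.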
Currently, anyone who wants to change the hyperparameters in the mapping rule sets has to understand and edit the JavaScript code. In the future, I would like to add sliders or knobs to the Max patch that change these hyperparameters directly, which would make the parameter tuning process easier.
I also think it would be great if all the separate pieces of software could be bundled into a single executable. This could be difficult, but it is important because it would make Headhunter much more user-friendly.
Finally, I am thinking of replacing the smartphone with a dedicated orientation sensor. That would make the headset lighter and eliminate the need to set up an app on the smartphone.
In this demo, I am playing a Rhodes and an ARP Solina string ensemble. The volume of the ARP Solina is set to 0 at the beginning.
I mapped the yaw axis to pitch bend, so you will hear a vibrato/out-of-tune effect when I turn my head left and right.
I mapped the pitch axis to the volume of the ARP Solina, so you can hear a synth string sound fading in and out as I look up and down.
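For comparison with the centered pitch-bend mapping above, a one-sided volume mapping like the one in this demo might look like the following sketch, assuming (hypothetically) that the raw pitch-axis value increases as the head tilts up; the constants are again illustrative.

```javascript
// Hypothetical one-sided mapping sketch: pitch axis -> CC volume.

inlets = 1;
outlets = 1;

var CENTER = 32768; // raw axis value at the neutral head position
var RANGE = 12000;  // raw units from neutral to full volume (assumed)

// Map a raw 0..65535 pitch-axis value to 0..127 for [ctlout] (e.g. CC 7):
// silence at the neutral position, full volume when looking up.
function msg_int(raw) {
    var up = Math.max(0, raw - CENTER);  // ignore looking down
    var level = Math.min(1, up / RANGE); // 0 at neutral, 1 at full tilt
    outlet(0, Math.round(level * 127));
}
```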
Set the ctlout and bendout objects to the MIDI channel you created in step 5. Select vjoy device as your input, and click poll 10 to start the Max patch.
Download The Headhunter Max Patch and Javascript Files