URTUNE


Processing/JAVA | FALL 2015

Composing music by extracting parameters from a user's sketch and environment


There have been various artists, like Tom Phillips, who have drawn to music. But what about people drawing music? I always imagined what it would be like to draw something and have a musical tune composed from the movement of my hand. What if my presence in the space at that moment altered the various parameters of that music?

The main inspiration for this project was to enable another outlook on music: to let the user draw and modify music. I chose to pursue this project as part of the Music and Technology class I took in Fall 2015 at UCSB.


This project was developed in Processing, primarily using the Minim library. The main functional units were oscillators and filters. Input from the touchpad and camera was used to compose music as described above.
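In outline, the audio side is a small Minim patch. Below is a minimal sketch of that setup, assuming Minim 2.2's UGen API; the variable names and amplitudes are illustrative, not the original code.

```java
// Minimal sketch of the audio setup, assuming Minim 2.2's UGen API.
import ddf.minim.*;
import ddf.minim.ugens.*;

Minim minim;
AudioOutput out;
Oscil squareOsc, sawOsc;

void setup() {
  size(640, 480);
  minim = new Minim(this);
  out = minim.getLineOut();

  // One square and one sawtooth oscillator, as described above.
  squareOsc = new Oscil(440, 0.3, Waves.SQUARE);
  sawOsc    = new Oscil(220, 0.3, Waves.SAW);

  squareOsc.patch(out);
  sawOsc.patch(out);
}

void draw() {
  background(0);
}
```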



WORKFLOW

STEP 1

The first step toward developing this project was to build the interactive app canvas. The user's sketches are recorded as a stroke line following the cursor's movement, and over time the sketch fades out.
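A minimal, self-contained sketch of the fading canvas (independent of the audio code): drawing a translucent rectangle each frame gradually erases older strokes.

```java
void setup() {
  size(640, 480);
  background(0);
}

void draw() {
  // A translucent rectangle each frame makes older strokes fade out.
  noStroke();
  fill(0, 10);
  rect(0, 0, width, height);

  // Record the sketch as a stroke line following the cursor.
  if (mousePressed) {
    stroke(255);
    strokeWeight(3);
    line(pmouseX, pmouseY, mouseX, mouseY);
  }
}
```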


STEP 2

The second step was to generate audio and map the sketch properties onto it. The audio was generated using square and sawtooth oscillators, whose frequencies were derived from the real-time y coordinate of the user's sketch on the canvas.
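As an illustration of that mapping (the original frequency range isn't recorded here, so the 110–880 Hz span below is an assumption), the draw() loop of the setup sketch above could derive pitch from the cursor like this:

```java
// Builds on the setup sketch above; setFrequency is Minim's Oscil API.
void draw() {
  // Higher on the canvas -> higher pitch. The range is an assumed mapping.
  float freq = map(mouseY, 0, height, 880, 110);
  squareOsc.setFrequency(freq);
  sawOsc.setFrequency(freq);
}
```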


STEP 3

The third step was to introduce and design filters for the system. A band-pass filter acts on the square-wave oscillator, and a low-pass filter acts on the sawtooth oscillator.
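Sketching this step with Minim's effects package (in Minim 2.x an IIRFilter can be patched like any other UGen), with placeholder values, since the exact cutoffs and bandwidths aren't recorded in this write-up:

```java
// Builds on the setup sketch above. Filter values are placeholder assumptions.
import ddf.minim.effects.*;

BandPass bandPass;
LowPassFS lowPass;

// Inside setup(), after creating the oscillators:
bandPass = new BandPass(440, 200, out.sampleRate());  // center freq, bandwidth, sample rate
lowPass  = new LowPassFS(1000, out.sampleRate());     // cutoff freq, sample rate

// Route each oscillator through its filter instead of
// patching it straight to the output.
squareOsc.patch(bandPass).patch(out);
sawOsc.patch(lowPass).patch(out);
```

The two filter types give the two voices distinct timbres: the band-pass narrows the square wave's harmonics, while the low-pass tames the sawtooth's brightness.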

STEP 4

The fourth step was to derive the filter properties from the camera's view of the user and the user's surroundings. Specifically, the bandwidth of the band-pass filter and the cutoff frequency of the low-pass filter are derived from the camera input. Each y position of the user's cursor on the canvas is mapped to a pixel column of the camera frame at that instant. The average brightness of the pixels in that column is then computed and mapped to the filter frequencies.
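A sketch of the camera half, assuming Processing's video library and the bandPass and lowPass objects from the previous step; the brightness-to-frequency ranges are assumptions:

```java
import processing.video.*;

Capture cam;

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
}

void draw() {
  if (cam.available()) {
    cam.read();
    cam.loadPixels();

    // Map the cursor's y position to a pixel column of the current frame.
    int col = constrain((int)map(mouseY, 0, height, 0, cam.width - 1), 0, cam.width - 1);

    // Average the brightness of that column.
    float sum = 0;
    for (int row = 0; row < cam.height; row++) {
      sum += brightness(cam.pixels[row * cam.width + col]);
    }
    float avg = sum / cam.height;  // 0..255

    // Brighter surroundings widen the band-pass and raise the low-pass cutoff
    // (the 50-2000 Hz and 200-5000 Hz ranges are assumptions).
    bandPass.setBandWidth(map(avg, 0, 255, 50, 2000));
    lowPass.setFreq(map(avg, 0, 255, 200, 5000));
  }
}
```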


FINAL OUTPUT