FingerNeedle Released
25 Mar 2010

I’ve finished the code cleanup and documentation for an instrument I announced earlier on the sc-users mailing list. It was originally named TouchNet, but I’ve decided to separate the “Touch sampler” and “Net” parts, so it is now called “FingerNeedle”. The download link is at the bottom of this post.
A video performance from the TouchNet days (with my FreeSound Quark); please watch it in fullscreen:
FingerNeedle is a gesture-based instrument operated from a multi-touch surface. In essence, it is a sampler that converts sounds into square images, and the performer plays rectangular portions of these images by touch. The images derived from the sounds give the performer visual feedback about where to touch on the surface and which portion of the sound to use as a source. Unlike a standard waveform display, the generated image lets the performer visually estimate the spectral characteristics of a sound and how they change over time. FingerNeedle allows several layers of sound to be controlled and played back simultaneously, and it borrows the idea of “gesture recording” from my deQuencher instrument. Recorded gestures can be post-processed to be slowed down and sped up dynamically.
The system can load uncompressed 16-bit mono sounds at any sample rate. A single sample in a 16-bit sound file can take 2^16 distinct values. The conversion system maps this range to shades of gray: the lowest possible value is black, the highest possible value is white (loosely analogous to the groove of a gramophone record), silence is mid-gray, and a full-amplitude sine wave is a continuous gradient across the shades of gray. Every pixel therefore represents a single sample of the sound file. The samples are arranged inside the image sequentially, in left-to-right, top-to-bottom order. This means that an 800×800 image window can contain 640000 samples (and pixels), which equals approximately 14.5 seconds of mono audio at a 44.1 kHz sample rate.
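As a rough illustration of this mapping, here is a minimal sclang sketch that reads a mono file, computes a gray level for each sample, and works out the side length of the square image that would hold them all. The file path and variable names are hypothetical, and this is not the actual FingerNeedle conversion code:

```supercollider
(
// Sketch of the sample-to-pixel mapping described above (not the FingerNeedle code itself).
var file, data, size, gray;
file = SoundFile.openRead("~/sounds/mono.wav".standardizePath); // hypothetical path
data = FloatArray.newClear(file.numFrames);
file.readData(data);  // samples arrive as floats in -1 .. 1
file.close;

// map each sample to a gray level in 0 .. 1: -1 -> black, 0 -> mid gray, +1 -> white
gray = data.collect { |s| (s + 1) * 0.5 };

// side length of the smallest square image that holds every sample,
// laid out left to right, top to bottom (pixel (x, y) corresponds to sample y * size + x)
size = data.size.sqrt.ceil.asInteger;

"image side: % px, % samples, about % seconds at % Hz".format(
    size, data.size, (data.size / file.sampleRate).round(0.1), file.sampleRate
).postln;
)
```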
This visualization scheme allows one to predict the spectral content and dynamic range of a sound from its image before hearing it for the first time; it aids the performance and also defines how the performer interacts with the instrument.
When a touch event is sensed, the instrument takes the blob size and multiplies its width and height by dynamically changeable multipliers. A translucent rectangle then becomes visible on the sound-image, and the instrument loops / plays the highlighted portion of the sound from the preloaded buffer.
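A minimal sketch of that touch-to-rectangle mapping might look as follows. The multiplier values, the centering of the rectangle on the touch point, and all names here are assumptions for illustration, not the actual FingerNeedle logic:

```supercollider
(
// Sketch: map a normalized touch point and blob size to a rectangle on the sound-image.
var imageSize = 800, widthMul = 4, heightMul = 4;  // multipliers assumed to be user-adjustable
var touchToRect = { |normX, normY, blobW, blobH|
    var w = (blobW * imageSize * widthMul).clip(1, imageSize);
    var h = (blobH * imageSize * heightMul).clip(1, imageSize);
    var x = ((normX * imageSize) - (w / 2)).clip(0, imageSize - w);  // center the rect on the touch
    var y = ((normY * imageSize) - (h / 2)).clip(0, imageSize - h);
    [x, y, w, h].collect { |v| v.round.asInteger };  // rectangle in pixel (= sample) coordinates
};
touchToRect.(0.5, 0.25, 0.02, 0.03).postln;  // -> [ 368, 152, 64, 96 ]
)
```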
To play a rectangular region from a buffer, I developed a unit generator called “NeedleRect”, which takes x, y, width, and height inputs; its output is used as an index for another unit generator that reads from a preloaded buffer.
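For the curious, the indexing idea can be approximated with standard UGens. The sketch below is not NeedleRect itself (whose interface I am only paraphrasing from the description above); the SynthDef name, argument names, and the two-Phasor construction are assumptions that show one way a rectangle of pixels can be turned into a stream of buffer indices for BufRd:

```supercollider
(
// Sketch of the indexing idea behind NeedleRect, built from standard UGens.
// x, y, w, h are in pixels of the sound-image; imgSize is the image side length.
SynthDef(\rectScan, { |out = 0, bufnum = 0, x = 0, y = 0, w = 64, h = 64, imgSize = 800|
    var col, row, index, sig;
    col = Phasor.ar(0, 1, 0, w);              // scan left to right inside the rectangle
    row = Phasor.ar(0, 1 / w, 0, h).floor;    // step to the next row after w samples
    index = ((y + row) * imgSize) + x + col;  // pixel position -> sample index in the buffer
    sig = BufRd.ar(1, bufnum, index, loop: 1, interpolation: 2);
    Out.ar(out, sig ! 2);
}).add;

// usage (assumes a mono buffer has already been loaded):
// b = Buffer.read(s, "~/sounds/mono.wav".standardizePath);
// x = Synth(\rectScan, [\bufnum, b, \x, 200, \y, 300, \w, 120, \h, 80]);
)
```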
FingerNeedle currently requires:
- A MacBook / MacBook Pro with a multitouch trackpad.
- A recent copy of BatUGens from the sc-plugins project, which includes the NeedleRect UGen. (The current binaries listed on the sc-plugins site do not include this UGen; you’ll need to compile it from source.)
- BatLib Quark
- MultiTouchPad Quark
- And recommended for fun: the FreeSound Quark.
Download FingerNeedle from my GitHub SCThings page:
http://github.com/earslap/SCThings/downloads
Please let me know if you try it and encounter any problems.