Touching Sound

Before I bought my iPad I studied the landscape of what I term “iOS music-making.” My iPhone had provided me with a little bit of a taste. Going into what amounted to a new chapter in my sound art, I remember reading someone’s counsel: ‘Don’t overwhelm yourself with apps, because there are too many and you’ll want to learn how to use a few of them.’

How uncongenial. Usually I learn just enough about whatever instrumentality attracts me, and then I go do some damage to the conventions certified by more learned, disciplined, and mature practitioners. I work this way because it’s enjoyable and provides for surprises.

I’ve been enjoying the possibilities the iPad unleashes.

The A5X processor that currently powers the iPad 3 is a system-on-a-chip (SoC): a single package combining a dual-core CPU running at 1 GHz with a quad-core PowerVR SGX543MP4 GPU.

Consider this: the A5X processor is at least four times faster than the Apple G3 I centered my original sound design studio on fourteen years ago, even if this is necessarily an apples-and-oranges comparison. From the viewpoint of this sound designer, the ability to use the screen as an element of the interface is the key to the kingdom.

The image of the first app page showcases my current top ten apps, ranked roughly by how much of my attention each has captured. TC-11 and Orphion top the list because of how each leverages the screen to manipulate sonic output.

Tomorrow a two-part piece that uses Drone FX drops at Kamelmauz Bandcamp Soundz. Although the app is designed to generate ambient soundscapes by letting five different sample maps run on their own, it also presents nine touchable keys for each map and sample set, so I’ve been using Drone FX as a controller.

At the level of apps, MIDI and inter-application sample/file operations seem to advance every month as apps announce upgrades. Workflow remains messy. Luckily, I’m old-fashioned and consider the iPad, for my purposes, to be an “audio out” device. The output goes into Logic, my DAW, and the coordination challenges are thus greatly diminished.

What’s really exciting is future interoperability. I’ve been playing around with using Orphion to control Animoog and ThumbJam. This has given me a taste of how the unique elements of different conceptions can be brought together into a ‘syncretic’ tool, a whole at least equal to its innovative parts. This is all about touch, and touching sound.
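For the curious, the plumbing that makes this app-to-app control possible is Core MIDI’s virtual endpoints. Here is a minimal C sketch of the idea, using a hypothetical controller app I’ll call “TouchController” (the name and code are illustrative, not how Orphion or Animoog are actually built): the controller publishes a virtual MIDI source, then emits note events from it for any other app to hear.

#include <CoreMIDI/CoreMIDI.h>

// Sketch only: a hypothetical controller app, "TouchController",
// publishing a virtual Core MIDI source on iOS.
static MIDIClientRef client;
static MIDIEndpointRef source;

void setUpVirtualSource(void) {
    MIDIClientCreate(CFSTR("TouchController"), NULL, NULL, &client);
    // The virtual source shows up in other apps' lists of MIDI inputs.
    MIDISourceCreate(client, CFSTR("TouchController Out"), &source);
}

// Send a note-on (channel 1, middle C, velocity 100) from the virtual source.
void sendNoteOn(void) {
    Byte buffer[64];
    MIDIPacketList *packetList = (MIDIPacketList *)buffer;
    MIDIPacket *packet = MIDIPacketListInit(packetList);
    const Byte noteOn[3] = { 0x90, 60, 100 };
    MIDIPacketListAdd(packetList, sizeof(buffer), packet, 0,
                      sizeof(noteOn), noteOn);
    MIDIReceived(source, packetList);  // distributes to every listening app
}

On the receiving end, a synth app sees “TouchController Out” in its MIDI input list just as it would a hardware keyboard, which is why these pairings need no special agreement between developers.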
