As most people who know me will be aware, I love technology, and I love music. The two have been intertwined in my journey since I was a teenager. From the outside, making music with technology might seem like a cold and expressionless way of producing “sounds” rather than music. I’ve recently come across a few videos that, to me, really articulate some of the magic that can happen when music and technology collide…
Technology has been a core part of my musical journey. In high school I remember hanging with my bestie Ashley at his house while we worked out how to make a MIDI recorder work (which I don’t know that we ever did), or tried to get his Alpha-Juno making the sounds we had in our heads. At uni I began to learn to program in earnest, in response to Peter Gabriel’s Xplora 1 CD-ROM. I took my next steps in sequencing with Cubase on an Atari ST, and worked in the studio recording, sometimes with early versions of Pro Tools. And once I started working, I bought my first legit copy of Logic (when it was still in Emagic’s hands) and ran it in my apartment, starting the process that led, eventually, to Fuzu.
These videos (below) really resonated with me as a way of describing and demonstrating how, under the right conditions, the combination of code and machines and human expression can transcend those individual components.
The first is an interview with electronic musician and composer Trent Reznor (of Nine Inch Nails fame) about the Moog Voyager synthesizer:
There are so many great bits in this, but a few standouts:
At around 9:20, in relation to his experience scoring for film, he notes the connection to emotional expression that certain instruments can have:
…how I emotionally respond to them versus specs or every possible option in depth. You know, the luxury [of] having a great room like this filled with cool stuff in it, that I have a pretty good working knowledge of how each piece makes me feel, you know. And as that thing comes up I kind of want to get this feeling out, and I think it should feel this way: I should feel a little insecure, and it should be a little warbly, but it shouldn’t feel happy, and it should be a little bit pissed off. What is that?
Later, at around 9:50, he says:
[I] realise how difficult it must be as a designer to put the right stuff in there, in the right box with the right feel. Because it doesn’t matter what it looks like, you know? [It’s] what it feels like, and that’s an achievement, you know, to elevate it, to give it life, rather than just “it’s a blueprint that someone did a nice job engineering.” It feels like something I want to sit down and engage with, because I’m not just writing code on it, you know? I’m translating [with an] instrument [that] somehow translates your emotions [into something] that someone else might experience and relate to. It does a great job. It is the archetype of what, to me, a synthesizer is and should be.
The crux of it is, as he relates around 2:50: “[The Mini-Moog] felt like a musical instrument, not just a collection of circuitry in a box.”
I love these quotes. When I think about sitting in front of the computer to write or record music, I don’t want to feel like I’m in front of a computer. I want it to be a creative process.
At a base level, that means having a stable system: not wrestling with drivers or updates, plugin conflicts, crashes, and all that (which is in part why I am macOS-centric for music). But there’s another layer, which Trent calls out here: the musicality, something that can convey the emotions you are trying to express or capture.
And that musicality comes from another layer on top of that: the design of the thing, such that it feels like an instrument. That’s about leaving a lot of things out, and about making smart choices around the parameters that are fixed. In the case of the Moog gear, it’s the physicality of it: the feel of the controls and the keyboard. But that applies just as much to software. I think this is one of the reasons I like Ableton Live so much: it really gels with how I work and invites me to experiment and play.
As a designer of software, this is the lofty goal I strive for but don’t often achieve: that satisfaction of the feel of something being just right. Intuitive. Expressive.
Another inspiration point was this performance by Dan Tepfer at the NPR Tiny Desk (a series that I thoroughly recommend; sooooo much good stuff):
So often, technology in a live context seems to be viewed as “cheating,” just used like a recording running in the background. Dan’s performance is wonderful, and his explanations of what’s going on beautifully demonstrate how technology can be both an inspiration point and an interactive element in a live performance.
He writes programs that follow simple rules to play the piano with him, at the same time, in response to what he’s playing. He, of course, then responds in kind, taking things in different directions based on what the computer is playing. This extends to the visuals, too, which are spellbinding. The fact it’s a real grand piano, too, just adds to the expressive juxtaposition of technology and “real” instruments.
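To make the idea concrete, here’s a toy sketch of the kind of simple rule he describes. This is purely my own illustration, not Dan’s actual code: the computer answers every note the human plays with its mirror image around a fixed axis pitch.

```python
# A toy "simple rule" in the spirit of a rule-based piano companion
# (hypothetical; not Dan Tepfer's actual software): for every note the
# human plays, the machine answers with that note reflected around a
# fixed axis pitch.

AXIS = 60  # middle C as a MIDI note number; an arbitrary choice for illustration

def mirror_note(note: int) -> int:
    """Reflect a MIDI note number around AXIS, clamped to the 0-127 MIDI range."""
    mirrored = 2 * AXIS - note
    return max(0, min(127, mirrored))

# A C major arpeggio from the human, and the machine's inverted answer.
phrase = [60, 64, 67, 72]
response = [mirror_note(n) for n in phrase]
print(response)  # [60, 56, 53, 48]
```

A rule this small already creates the feedback loop he talks about: the player hears the inversion, reacts to it, and the machine reacts right back.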
The connection between music and math is well known, as is the connection between math and programming. Dan’s performance is a wonderful demonstration of all these elements coming together to great creative effect.
And another: I’ve seen Imogen Heap’s Mi.Mu Gloves before. Given my love of digital making (I’m nearly positive there’s an ESP8266 in the mix somewhere) I was intrigued to see this “behind the scenes” demonstration of the capabilities of the gloves, and, more importantly in my view, how Imogen uses them to connect physical expression (the hand’s movements) with musical expression.
I also love how she’s developed the software that connects with Ableton Live to learn your gestures and to program which MIDI elements are controlled, and how. The way she uses a gesture to “throw away” a loop that isn’t working, so effortlessly and subtly, is brilliant. So is the way she can trigger a reference tone into her in-ear monitors to pitch the opening notes of an a cappella piece, and the use of “hidden” gestures in her fingers to change the meaning of lifting her hand from one parameter to another. Just brilliant. I don’t know how she remembers all of the different gestures song-to-song, but I suppose it becomes muscle memory soon enough, and just another part of the physical expression that naturally comes with performing a piece of music. Magic.
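As a rough illustration of that idea (entirely hypothetical, and not the Mi.Mu software’s actual design or API), a gesture mapping like this boils down to turning a finger posture plus a continuous movement, like hand height, into MIDI control change messages, with the posture deciding what the movement means:

```python
# Hypothetical sketch of posture-dependent gesture mapping. The posture
# names and CC numbers are invented for illustration; only the general
# shape (posture selects the parameter, movement supplies the value)
# reflects what's described above.

POSTURE_TO_CC = {
    "open_hand": 7,     # e.g. channel volume
    "two_fingers": 91,  # e.g. reverb send
}

def gesture_to_cc(posture: str, hand_height: float) -> tuple[int, int]:
    """Map a posture plus normalised hand height (0.0-1.0) to (cc, value)."""
    cc = POSTURE_TO_CC[posture]
    value = round(max(0.0, min(1.0, hand_height)) * 127)
    return cc, value

print(gesture_to_cc("open_hand", 0.5))    # (7, 64)
print(gesture_to_cc("two_fingers", 1.0))  # (91, 127)
```

The “hidden” finger gestures she describes fit naturally here: a tiny change of posture swaps which row of the table the same hand movement drives.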