State Change for solo flute

Switching gears to an entirely acoustic piece: I recently finished composing a work for solo flute titled State Change. The program notes read:

Many substances exist in different phases of matter, or “states”. When external energy is applied, solids become liquids become gases, and upon the cessation of that outside energy a gas becomes a liquid becomes a solid. Notably, during some state transitions properties of the substance change discontinuously, resulting in abrupt changes in the volume or mass of the substance.

This piece explores the sonic analogy of state changes, modifying the parameters of the sound of the flute to transition between different textures (“states”).

My process for composing this piece was a new one for me. Taking inspiration from advice given to me by Tom Lopez (who mentioned that it was a compositional technique used by Morton Subotnick), I first created a “Parameter Map”: parameters such as pitch, volume, and “airiness” mapped over time, with the time dimension striated into 24 discrete segments. The solid black areas correspond to the value of a parameter over time.

State Change Parameter Map

image

I then took the parameters at the start of each of the 24 segments and represented them as the first note in each of 24 measures in a score with corresponding pitch, note length, volume, etc. (e.g. D4, quarter note, mezzoforte). I call this document a “Notation Template”.

State Change Notation Template

image

From here, my compositional goal was to “fill in” the measures, making sure that transitions between different parametric values (molto vibrato to no vibrato, for example) were smooth and interesting. This part of the process was the most intriguing to me. It gave me the freedom to create interesting gestures without having to worry about the overall trajectory: at every point in the process I knew the goal of the gestures I was creating and the trajectories that spawned them. Put more colloquially, I always knew “where I was coming from and where I was going to”, because of the notation template and parameter map.
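The parameter-map-to-notation-template step described above could be sketched roughly as follows. This is purely my own illustration of the idea (the actual maps were drawn by hand, and the curve functions, pitch set, and dynamic set here are invented for the example): each parameter is a curve over time, sampled at the start of each of the 24 segments to seed the first note of each measure.

```python
import math

NUM_SEGMENTS = 24
PITCHES = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5"]
DYNAMICS = ["pp", "p", "mp", "mf", "f", "ff"]

def pitch_curve(t):
    """Hypothetical normalized pitch contour over the piece, t in [0, 1)."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * t)

def volume_curve(t):
    """Hypothetical normalized dynamic contour: a simple crescendo."""
    return t

def quantize(value, choices):
    """Map a normalized 0-1 value onto a discrete list of symbols."""
    index = min(int(value * len(choices)), len(choices) - 1)
    return choices[index]

# Sample each parameter at the start of each segment to get the
# seed note for the corresponding measure of the notation template.
template = []
for segment in range(NUM_SEGMENTS):
    t = segment / NUM_SEGMENTS
    template.append((quantize(pitch_curve(t), PITCHES),
                     quantize(volume_curve(t), DYNAMICS)))

print(template[0])  # → ('G4', 'pp')
```

The composer then fills in the rest of each measure freely, knowing its starting note and the parametric target of the next measure.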

Ultimately, I ended up modifying some of the timings and parametric values, but the process helped me to create a piece and formal structure that I otherwise would not have. My favorite moments in the piece are those that mirror the “abrupt changes in… volume or mass” alluded to in the program notes: moments when the changing parameters briefly interact to create a new texture before diverging.

State Change Final Score

image

I hope to continue exploring this and other experimental compositional techniques in future pieces.

Medium Mapping

Since 2009, I’ve been interested in exploring medium mapping: using data or gestures from one medium (motion, for example) to generate gestures in, or modify parameters of, another medium (audio, for example).
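As a toy illustration of the general idea (not the actual patches described below), a medium mapping might take a normalized motion magnitude from a camera, such as frame-difference energy, and map it onto audio parameters; the function name and parameter ranges here are my own invention:

```python
def map_motion_to_audio(motion, min_cutoff=200.0, max_cutoff=8000.0):
    """Map a normalized motion magnitude (0-1) to hypothetical synth parameters."""
    motion = max(0.0, min(1.0, motion))  # clamp to the valid range
    amplitude = motion ** 2              # quiet when still, loud when moving
    cutoff = min_cutoff + motion * (max_cutoff - min_cutoff)
    return {"amplitude": amplitude, "cutoff_hz": cutoff}

# A burst of motion opens the filter and raises the level:
for m in (0.0, 0.5, 1.0):
    print(map_motion_to_audio(m))
```

The interest, of course, lies in how literal or abstract the mapping is made, which the pieces below explore.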

This interest first took shape in Motion-Influenced Composition, a piece created for Per Bloland’s Advanced DSP class at Oberlin Conservatory. The piece used Jean-Marc Pelletier’s OpenCV objects for MaxMSP to parse out gestures and scene changes in live video from a camera, which were then used to generate synthesized sounds in real time. Shortly afterwards, I extended the piece to include a video component created in Jitter (mapping the audio medium onto video). A video of a performance can be found here.

(To digress for a moment: the video component ended up being quite versatile. It was a stereo FFT spectrogram that colored different frequencies based on their timbre and had many customizable parameters. A video of it in use can be seen here.)

I remained interested in medium mapping in the years that followed, and did a good deal of work in film sound design (which could be considered audio mapped to video, or vice versa). This past summer, as an artist in residence at the Atlantic Center for the Arts, I revisited the idea from a real-time perspective and created the piece Transference, a recasting and refinement of some of the ideas present in Motion-Influenced Composition.

Transference again uses the OpenCV MaxMSP objects to extract gestures and other information from a video camera, but uses Processing to create the video. A number of other differences exist: the sounds used in Transference are not synthesized (as they were in Motion-Influenced Composition) but are instead samples of voices, and the video component is three-dimensional rather than the 2-D video of Motion-Influenced Composition. Because the sound material is real-world and the video is not a direct representation of the audio (i.e., a spectrogram), there is a great deal more abstraction in the medium mapping of Transference than in Motion-Influenced Composition. Playing with the abstraction between mapped mediums is fascinating to me, and I hope to explore it more in the future. A video of a performance of Transference can be found at 40:32 in this video.

I have also written a draft of a scholarly paper on Transference and Motion-Influenced Composition that can be seen below.

Motion-Influenced Composition and Transference: Performance Experiments in Medium Mapping

First Post & "Industrial Revelations" Analysis

I have tried in the past to maintain a blog and failed. My goal with this blog is to post something interesting every day, be it my own work, an idea, or a link to another artist’s or group’s work.

The first project I’m going to discuss is an ongoing analytical paper I am writing for Ted Coffey’s seminar class here at the University of Virginia, focused on Natasha Barrett’s “Industrial Revelations”, an eleven-and-a-half-minute electroacoustic composition that closes her 2002 album Isostasie.

I have been fascinated by Barrett’s work for many years (hearing her piece Racing Through, Racing Unseen on Miniatures Concrètes is what got me interested in acousmatic-style electronic music).

My goal in the analysis is to explore how the piece traverses the areas between the sounds of machines (predominantly trains in the piece), human sounds (voice, residual sounds of human actions), and the sounds of their respective (and at times overlapping) environment (organic, textural sounds).

I am using Pierre Couprie’s EAnalysis program to analyze the structure and source material of the piece. Here’s a screen shot of the analysis in progress:

image

Another plan is to transcribe the last minute of the piece into traditional staff notation. I feel that pitched material in acousmatic music is rarely analyzed melodically or harmonically, and the last minute of this piece is a particularly good candidate: spatialization is minimal and all of the sounds have focused pitch centers.

I will be writing the paper through the rest of November.