Put the audio files into a playlist in the music player app
Turn on the Shuffle mode to randomize the playback order of the files
Place the robot vacuum and its charger on the stage
Score example and explanation
Cue # | Low Energy / Slow / Quiet | High Energy / Fast / Loud
In each cue, the violinist chooses a phrase from 4-6 options. If the sound emitted from the robot vacuum seems to be in the low energy/slow/quiet category, choose a phrase from the left column. Choose a phrase from the right column if the vacuum sound fits the high energy/fast/loud category.
Any of the following determines the duration of a cue:
Duration of the audio file. Go to the next cue if a new audio file plays.
Time for the vacuum to take a turn. If the vacuum changes its direction, go to the next cue.
The violinist can signal the tech operator to go to the next cue.
The regular bar line in the notation indicates the duration of the cue. If a cue lasts 30 seconds, play the phrase between the regular bar lines for approximately 30 seconds.
The repeat bar line in the notation indicates that the phrase should be repeated until the next cue.
Improvise the pitch and rhythm according to the note shapes and positions
Notes within the bar line: play mid-range notes
Notes above the bar line: play high-range notes
Notes below the bar line: play low-range notes
Note durations: interpret according to the current cue
Whole notes and half notes: very slow and slow
Quarter notes: neither fast nor slow
Eighth notes: fast
Glissando: notated as a line connecting two notes
Rectangular note heads: play an airy, noise-like yet pitched tone
X note heads: mute strings
Performance Instructions for the Tech Operator
The tech operator’s main job is to cue the robot vacuum and the audio player.
If the audio player starts playing the next track, go to the next cue by changing the vacuum’s direction.
If the vacuum turns to a different direction before the end of the audio file, go to the next cue by skipping to the next audio file on the audio player.
At the performer’s signal to go to the next cue, change the direction of the vacuum and skip to the next audio file.
Note that the beginning of the piece (cue #0) is a vacuum solo. The audio file starts playing at cue #1.
Note the specific sequence for turning off the vacuum and the audio player at cues #28-30.
End Credits, a fixed media composition included in the Fan Art album, is documented in five sections. The first section, Program, is the program note to be included in a concert booklet or album promotion. I share information and thoughts that may help listeners appreciate the music better. The second section, Form, is for creators who want to learn how I used electronic instruments to create a complete piece. The third section, Code, is for technologists who wish to understand how I coded the piece. Musicians familiar with code-based apps like SuperCollider and Max will benefit from analyzing the code. In the fourth section, Inspirations, I share why I chose to write the piece. The content is too personal to be on the program. The last part, Uniquely Electronic, is a bonus section featuring sounds and ideas I could not express in an album format. Fan Art is available as streaming stereo tracks, but the songs were originally designed for multi-channel sound installation. The last section provides resources to realize the songs in Fan Art at full capacity.
End Credits is an algorithmic composition based on the harmonic progression of Debussy’s Clair de Lune. The SuperCollider code written for the piece generates notes with unique overtones, and the overall sound reminds me of organ music at viewings. My friend and I joke about writing each other’s farewell music, and now I have one for him. If he doesn’t like it, I will use it as my exit theme.
End Credits uses the harmony of Debussy’s Clair de Lune. The downloadable SuperCollider code, EndCredits.scd, makes sound according to the following instructions (a rough sketch follows the list):
Choose a list containing all notes present in measure x.
Scramble the order of the notes.
Play notes at random timing. There is a 50% chance of two notes being played simultaneously.
When all notes in the list are used, move to measure x+1.
Repeat steps 1-4 at a slow tempo (quarter note = 3.6 seconds). End Credits uses the harmonic progression from mm. 1-27 of Clair de Lune.
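To make the process concrete, here is a minimal sketch in the spirit of the five steps above. It is not the actual EndCredits.scd: the chords, the timings, and the use of SuperCollider’s default synth are placeholders.

// Minimal sketch of steps 1-5 with placeholder chords; not the actual EndCredits.scd.
// Boot the audio server first (s.boot) if it is not already running.
(
~sketchChords = [                              // step 1: one list of MIDI notes per measure
	[41, 53, 65, 68, 73],                      // placeholder voicing for measure x
	[41, 53, 63, 68, 72],                      // placeholder voicing for measure x+1
	[39, 51, 63, 67, 70]
];

Routine({
	~sketchChords.do { |measure|               // steps 4-5: advance one measure at a time
		measure.scramble.do { |midinote|       // step 2: scramble the note order
			(midinote: midinote, amp: 0.2, sustain: 3.6 * rrand(1.0, 3.0)).play;
			if (0.5.coin) {
				0.wait;                        // step 3: 50% chance the next note sounds simultaneously
			} {
				(3.6 * rrand(0.5, 2.0)).wait;  // otherwise wait a random time (quarter note = 3.6 s)
			};
		};
	};
}).play;
)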
The EndCredits.scd code also generates each note according to the following instructions; a simplified sketch follows the list.
Make a sine tone with randomized slow vibrato and tremolo using two LFOs. Randomize the amplitude envelope, LFO rate, LFO amount, and pan position.
Make a single note by combining five sounds made in Step 1. Then randomize the phases, frequencies, and pan positions of the five sounds to make a slightly detuned note with a wide stereo image.
Make a single note with a random number of overtones using the note generated in Step 2. The note’s duration is also random but is almost always longer than a quarter note (3.6 seconds).
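A heavily simplified sketch of these three steps might look like the following. The numeric ranges, the 110 Hz base frequency, and the envelope are illustrative guesses, not the values used in the piece.

// Conceptual sketch of steps 1-3; all ranges are illustrative, not those in EndCredits.scd.
(
{
	var overtones = rrand(1, 8);                  // step 3: random number of overtones
	Mix.fill(overtones, { |partial|
		Mix.fill(5, {                             // step 2: five slightly detuned copies per partial
			var vibrato = SinOsc.kr(rrand(0.1, 0.5), rrand(0.0, 2pi), rrand(0.5, 2.0));  // slow pitch LFO
			var tremolo = SinOsc.kr(rrand(0.1, 0.5), rrand(0.0, 2pi), 0.3, 0.7);         // slow amplitude LFO
			var freq = 110 * (partial + 1) * rrand(0.998, 1.002) + vibrato;              // step 1: sine with vibrato
			Pan2.ar(SinOsc.ar(freq), rrand(-0.7, 0.7), tremolo * 0.02 / (partial + 1));  // random pan, quieter upper partials
		});
	}) * EnvGen.kr(Env.linen(2, 4, 4), doneAction: 2);  // about 10 seconds, longer than one quarter note (3.6 s)
}.play;
)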
The resulting sound is an imaginary organ capable of changing the stops at every note. The instrument also seems to have multiple sustain pedals.
The compositional objective of End Credits is akin to that of minimalism. I wanted to create a simple process that yields unexpectedly delightful sounds. So I made an ambient piece using additive synthesis and traditional harmony with computer-aided instructions. There is no new technology or concept here, but new sounds emerge from combining old ideas.
Code
EndCredits.scd, the SuperCollider file I made to generate the album version of End Credits, has the following sections.
SynthDef(“Cell”): makes a sine tone with controlled random values
~note: makes a note event using SynthDef(“Cell”)
~chords: an arrayed collection of note numbers representing the harmonic content
~event: plays one measure using ~note with pitches from ~chords
SystemClock: plays the music
To make sense of this section, open EndCredits_Analysis.scd in SuperCollider and refer to the code while reading the next sections. The analysis .scd file contains simplified, working code.
SynthDef(“Cell”)
End Credits uses one SynthDef featuring two LFOs, one ASR envelope, and one stereo sine tone generator. By playing two instances of SynthDef(“Cell”) with slight pitch differences, we can make a simple detuned sound. The code in section //1. “Cell” without randomness shows the simplest form of the instrument. The actual SynthDef used in the piece is in //2. “Cell” with randomness. It applies ranged random values to vary the amplitude envelope’s attack and release times, the LFOs’ frequency, phase, and amplitude, and the pan position.
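For readers who do not have the analysis file open, here is a rough stand-in for the “Cell” idea. The name \cellSketch, the argument list, and the random ranges are my guesses, not the published SynthDef.

// Rough stand-in for the "Cell" idea; names and ranges are guesses, not the published SynthDef.
(
SynthDef(\cellSketch, { |freq = 220, amp = 0.1, atk = 2, rel = 4, gate = 1,
	vibRate = 0.3, vibAmt = 1.0, tremRate = 0.2, pan = 0|
	var vibrato = SinOsc.kr(vibRate, Rand(0, 2pi), vibAmt);          // LFO 1: slow vibrato (pitch)
	var tremolo = SinOsc.kr(tremRate, Rand(0, 2pi), 0.3, 0.7);       // LFO 2: slow tremolo (amplitude)
	var env = EnvGen.kr(Env.asr(atk, 1, rel), gate, doneAction: 2);  // ASR amplitude envelope
	var sig = SinOsc.ar(freq + vibrato) * tremolo * env * amp;
	Out.ar(0, Pan2.ar(sig, pan));                                    // stereo output
}).add;
)

// Evaluate the SynthDef above first. Two slightly detuned instances then make a simple detuned sound:
~a = Synth(\cellSketch, [\freq, 220, \atk, rrand(1.0, 3.0), \pan, rrand(-0.5, 0.5)]);
~b = Synth(\cellSketch, [\freq, 220 * rrand(1.001, 1.003), \atk, rrand(1.0, 3.0), \pan, rrand(-0.5, 0.5)]);
// Release both when done: ~a.set(\gate, 0); ~b.set(\gate, 0);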
~note
~note is a function with the following parameters.
~note.(pitch in MIDI number, duration (sec), volume (0-1), number of overtones);
By providing a number for each parameter, ~note creates a tone with a varying number of detuned overtones. As we can observe in //3. ~notes, loops (.do) and ranged random number generators (rrand) are used extensively. Try running the following lines in SuperCollider to hear the difference. Notice that the sounds are not identical when the code is re-evaluated. A hypothetical reduction of ~note follows these examples.
~note.(50,10,0.3,1); //no partials
~note.(50,10,0.3,5); //some partials
~note.(50,10,0.3,10); //many partials
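The reduction below is built on the \cellSketch SynthDef sketched earlier. The real ~note differs in its ranges and details, but the same ingredients appear: .do loops, rrand, slight detuning, and quieter upper partials.

// Hypothetical reduction of ~note; requires the \cellSketch SynthDef from the previous sketch.
(
~noteSketch = { |midi = 50, dur = 10, vol = 0.3, overtones = 1|
	overtones.do { |partial|                                    // one stacked partial per overtone
		2.do {                                                  // two detuned copies per partial
			var syn = Synth(\cellSketch, [
				\freq, midi.midicps * (partial + 1) * rrand(0.998, 1.002),  // slight detune
				\amp, vol / (overtones * 2) / (partial + 1),    // quieter upper partials
				\atk, rrand(0.5, dur * 0.3),
				\rel, rrand(dur * 0.3, dur),
				\pan, rrand(-0.7, 0.7)
			]);
			SystemClock.sched(dur, { syn.set(\gate, 0); nil }); // close the ASR gate after dur seconds
		};
	};
};

~noteSketch.(50, 10, 0.3, 5);   // compare with the ~note calls above
)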
~chords and ~event
~chords is an array of interval values representing the notes in each measure of Clair de Lune. As evident in //4. ~chords, the method .scramble is appended to every measure to randomize the note order. For analysis purposes, only three lists are inside ~chords.
~event plays ~note according to the pitch choices in ~chords. One measure of End Credits is thus generated with the following parameters:
~event.(measure index number, note duration factor, amplitude, overtones, harmony probability (0-1.0))
As we can observe in //5. ~event, the note duration factor is a multiplier for each note’s duration. The larger the number, the longer the notes last, resulting in a sustain pedal-like effect. The overtone amount also gets a slight randomization for variety. The last parameter, harmony probability, controls the chance of the following note being played simultaneously with the current one.
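The sketch below imitates the ~chords/~event pair using ~noteSketch from the previous sketch. The pitch lists are placeholders, not the actual Clair de Lune voicings, and the parameter handling is a guess at the behavior described above.

// Hypothetical ~chords/~event pair; requires ~noteSketch (and \cellSketch) defined above.
(
~chordsSketch = [
	[41, 53, 65, 68, 73].scramble,   // placeholder measure 1, note order randomized
	[41, 53, 63, 68, 72].scramble,   // placeholder measure 2
	[39, 51, 63, 67, 70].scramble    // placeholder measure 3
];

~eventSketch = { |measure = 0, durFactor = 1, amp = 0.3, overtones = 3, harmonyProb = 0.5|
	Routine({
		~chordsSketch[measure].do { |midi|
			~noteSketch.(midi, 3.6 * durFactor * rrand(1.0, 2.0), amp, (overtones + rrand(-1, 1)).max(1));  // slight overtone randomization
			if (harmonyProb.coin) { 0.wait } { (3.6 * rrand(0.5, 1.5)).wait };  // harmony probability
		};
	}).play;
};

~eventSketch.(0, 2, 0.3, 4, 0.5);   // first placeholder measure with a sustain-pedal-like duration factor of 2
)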
SystemClock
The code in //6. SystemClock is responsible for putting everything together to produce audible sound. The section uses Routine to make ~event go through all the measures provided in ~chords. SystemClock also provides 2 seconds of silence, a little pause before the music begins.
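A hypothetical top-level scheduler in the spirit of //6. SystemClock, built on the sketches above, could look like this; the measure length and parameter values are placeholders.

// Hypothetical scheduler; requires ~chordsSketch, ~eventSketch, ~noteSketch, and \cellSketch from above.
(
SystemClock.sched(2, {                         // 2 seconds of silence before anything sounds
	Routine({
		~chordsSketch.size.do { |measure|      // step through every measure in ~chordsSketch
			~eventSketch.(measure, 2, 0.3, 4, 0.5);
			(3.6 * 4).wait;                    // rough measure length before moving on
		};
	}).play;
	nil;                                       // do not reschedule
});
)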
Inspirations
I composed End Credits in February 2021, at the end of the strict COVID-19 isolation days. I must have been listening to Debussy recordings often to keep myself together. Clair de Lune’s lowest note, the Eb in m. 15, felt like the most beautiful piano note then. The timing was perfect, and it resonated with the piano’s body and the listener’s mind. I wanted to recreate that in an electronic music context. So, I created a sound that imitates the slight detune of the low range of the piano. Then, perhaps due to COVID blues, I instructed the computer to play those sounds in blurry and slow motion.
Uniquely Electronic
End Credits has two playback modes: ~onetime or ~infinite. The one-time version with a fixed duration (8:26) is available on major distribution platforms, and any media player can play it. The installation version goes on indefinitely with varied timbre, timing, and duration. The listener will need to run EndCredits_infinite.scd in SuperCollider. Let it run for hours on solemn and not-so-happy occasions!
No Remixes will release my new album on January 20th. I named the album Fan Art because each track is a tribute to things I fanatically admire – Bach’s Prelude in C minor, Debussy’s Clair de Lune, traditional Korean rhythms, no-input mixing, palindromes, ii-V-I progressions, and synthesizers. The way I expressed my fandom may be excessive, too personal, niche, or inappropriate, but those qualities are what make fan art great!
After the album release, I will write a blog post for each piece in CMPE style. Revisit my social media to learn about algorithmic composition and sound design!
I made a list of my published and available works using Google Sheets. Click HERE to view. The work list already exists on joowonpark.net, but the HTML format is difficult to sort, analyze, and assess. A catalog in spreadsheet format allows me to revise and manage the pieces with much more ease. The current file has the following information per piece:
Title
Year Published: not the year the piece was composed, but the year it became available to the public.
Instrumentation: Instrument names are alphabetized.
Album: If the piece is a part of an album, the album title is available.
Co-Creation: Indicates if someone contributed during the composition/production process. Examples include co-composers, co-producers, choreographers, and theatre directors, but not performers.
Notes: miscellaneous info. It could be extended into another column in the spreadsheet.
Organization Principles
There are 121 pieces listed in the catalog as of December 2022. That number is larger than the 98 entries currently in my BMI catalog for the following reasons:
The Google Sheet catalog includes sound installations, recordings of free improvisations, and web projects that are tricky to register as compositions.
Some are dance and theatre collaborations that require extra paperwork to register with BMI (feel free to correct me if I am wrong).
I also had to decide how to catalog 100 Strange Sounds and CMPE. The two were multi-year projects consisting of many short pieces. I had to choose between counting them as two long compositions or as 132 separate pieces.
100 Strange Sounds counts as one piece. It is a mosaic of individual entities with a common goal and theme, like Ik-Joong Kang’s Happy World.
The most challenging part of cataloging was deciding what not to include. My principal guideline was whether the piece had online documentation.
Completed works not presented to the public are not listed.
Published works without links are not in the catalog. Examples include
Premiered works without video or audio recording
Premiered works with missing concert programs, recordings, or scores. Most of the compositions during my graduate school years are in this category.
Published works in a DVD or CD format only. My first published work is in this category.
A temporary exception is the pieces on the Fan Art album. Fan Art will be released in January 2023, so I put them into the catalog as placeholders.
Short Analysis
My productivity increased to a satisfactory level after graduate school. The number of presentable pieces per year was the lowest when I had the most time to work on music as a graduate student. Conversely, my productivity peaked when I had the least amount of time. I had a newborn baby and a first-grader to raise in 2015, but I released an album and was working on the next one. I accept that the pieces I made before 2009 were not good enough to get into conferences and invited shows. Also, I started using platforms like YouTube to facilitate sharing and documenting work after finishing school.
The number of album releases and large-scale works indicates that I work well with long-term projects. Creating an album with a theme keeps me in creative mode. Multi-movement electroacoustic work is fun. A single piece that requires an extended amount of time to gather sources or produce sounds positively challenges me. I will consider my experience in planning and executing long-term projects as my strength and continue developing it.
Co-creation accounts for about 30% of my creative output. I thought I preferred working alone, but the pie chart says I am not bad at collaborations. I am confident that I can tackle bigger projects involving multiple personnel in the future.
There are a few things to improve in the catalog. I want to record each piece’s duration to compare the effort I put into each work. I can also list performers who premiered the piece. The number of performances per piece can also be pertinent data for the catalog. Analysis of such data can show me where to concentrate my creative energy for the next few years.