Category Archives: Research

Endorsement – SuperCollider for the Creative Musician

I wrote a book endorsement for SuperCollider for the Creative Musician by Eli Fieldsteel:

SuperCollider for the Creative Musician teaches how to compose, perform, and think music in numbers and codes. With interactive examples, time-saving debugging tips, and line-by-line analysis in every chapter, Fieldsteel shows efficient and diverse ways of using SuperCollider as an expressive instrument. Be sure to explore the Companion Code, as its contents demonstrate practical and musically intriguing applications of the topics discussed in the chapters.

The endorsement had a word count limit. This book deserves a more detailed review. I agree with Fieldsteel’s statement in the Introduction that the book is a “tutorial and reference guide for anyone wanting to create electronic music or experimental sound art with SuperCollider.” Musicians, media artists, and programmers will learn the fundamentals and practical applications of SuperCollider by reading the book from cover to cover. I especially recommend this book to musicians seeking the connection between creative coding and their artistic practice. Electronic musicians learn to express musical ideas in numbers and symbols when they code music. As a result, coding trains users to think about music differently, and the author does an excellent job of teaching how to do so.

Fieldsteel’s expertise in composing, performing, and teaching SuperCollider for over a decade is evident in every chapter. The author correctly anticipates common beginner challenges and provides the most efficient solutions. I love the Tip.rand sections dedicated to troubleshooting and debugging. They are essential in increasing productivity and decreasing the frustration of learning a new environment. The book’s biggest strength, as demonstrated in the Tip.rand sections, is its accessibility. The language, style, and examples do not assume that readers have previous programming, music synthesis, or audio engineering experience. The included figures, tables, and example code are also effective and pedagogical. I was happy to see that the printed code’s font is identical to the default font of the SuperCollider IDE. It reconfirms the author’s effort to create inviting chapters for learning a language with a considerable learning curve.

I spend the first month of my SuperCollider class helping students overcome the initial steep learning curve. The book will dramatically reduce the time and frustration of going over that hump. I don’t think other existing SuperCollider resources will help as much as Fieldsteel’s book for that purpose.

How to Play a Solo Set

I often prepare a set for solo show opportunities. A set is a performance practice of playing multiple pieces without significant pauses (i.e., no “set changes”). It is often long (30+ minutes), and the works presented within share a common theme or instrumentation. The ability to perform a solo set is helpful, if not essential, for electronic music performers in getting gigs and collaborative projects. A DJ set at a music festival is a good example of a set performance.

I played a set consisting of seven original compositions at the 16th Strange Beautiful Music (SBM) Festival in September 2023. I will use the recording of this particular set to show how I organize a 40-minute set. I hope readers get macro- and micro-level insight into an electronic set performance, especially when this article is read together with my analysis of solo set gear.

I go through four preparation steps for a set performance.

  1. Decide pieces
  2. Decide the order
  3. Practice transitions
  4. Practice sound check

I will explain the details of each step in the subsequent sections. Please refer to the chart below to see the overall timeline of the SBM set. The chart lists the estimated starting time and instrument used in each section.  

Link to Google Sheet version of the chart

Decide Pieces

The selection of pieces depends on external factors I cannot control. Examples are the total duration decided by the organizer, the sound check time, and the venue’s equipment. Once I learn the external factors, I decide which pieces to include in a set. I was invited to perform for 40 minutes at SBM, held at Andy Arts Center’s Hanger. I had one hour of tech time with an excellent audiovisual team. Given this information, I decided to play the following pieces.

  • Scramble and Sort (2023) – for computer and drum machine
  • Page Turner’s Agony (2021) – for computer and MIDI controller
  • Gums (2013) – for no-input mixer
  • Cobalt Vase (2019) – for drum machine
  • Func Step Mode (2019) – for no-input mixer and drum machine
  • Toccata (2009) – for computer and contact mic’ed objects
  • Elegy No 2 (2017) – for computer and melodica

Each piece in the set features a unique combination of electronic sound and instrument. All pieces involve improvisation, so the audience hears an event-specific version of each piece. The SBM performance also included the world premiere of Scramble and Sort; adding an artistic risk helps prevent potential practice fatigue.

Decide the Order

The order of the pieces in a set should be carefully tweaked for seamless transitions between the pieces. A well-thought-out sequence of compositions also keeps the audience engaged. My theme for the SBM set was to show multiple electronic instruments in different contexts, so the order of revealing different instruments and styles was a priority. I opened the set with Scramble and Sort as an appetizer – music with an easy-to-digest rhythm and the familiar sounds of a drum machine. The following two pieces, Page Turner’s Agony and Gums, had more abstract and timbre-based electronic sounds, featuring a MIDI controller and a no-input mixer. Then, as a main course, Cobalt Vase, Func Step Mode, and Toccata featured unconventional combinations of familiar instruments. The sounds of these pieces were the most aggressive and noisiest; they would not have been appropriate as opening pieces. As a palate cleanser, I ended the set with Elegy No. 2, a slow and minimalistic piece featuring melodica.

The visual elements are also a factor in deciding the order. One person playing an instrument with relatively little flexibility for an extended period is not exciting by default. I try to improve this by introducing a new instrument with each piece. In the SBM set, a drum machine, a no-input mixer, found objects on a contact mic, and a melodica were introduced in sequence. The chart above shows the order in which the instruments appear and how much each is used.

Practice Transitions

Smoothly connecting one piece’s ending to the next piece’s beginning is not something I consider when composing a work, but in a set I am responsible for making the transition musically satisfying. I want the audience to enjoy the process of timbral and stylistic changes in gradual motion.

In music production, a crossfade function gradually transforms one sound into another (it is the equivalent of the cross dissolve in video editors). Most of my practice for a set focuses on devising and rehearsing live crossfades. Because the order of the pieces can be unique to each gig, the crossfades are unique to each event. I consider crossfades of significant length (2-3 minutes), as heard five times in the SBM set, to be mini event-specific compositions.

Practice Sound Check

Setting up and striking the gear should be part of the practice and preparation. A solo set often needs a suitcase full of cables and instruments, and there is no time to think about signal flow during the sound check. It takes me a few hours to test all the gear and figure out the optimal configuration for every gig. When the configuration is finalized, I practice setting up and tearing down the equipment. I aim to be ready for the sound check within 30 minutes of arrival at the venue.

Being as self-contained as possible in terms of gear increases efficiency. I packed all the gear and bought a folding table for the SBM set. The less time I spend finding the right table and setting up the gear, the more time I can use during the allotted sound check to troubleshoot and tweak the sound for the room. For the SBM performance, I left the box of toys I use for Toccata at home. So, I finished the sound check early and picked up rocks, bolts, and other objects in the venue’s parking lot. I hope no one noticed my mild panic before and during the performance.

Outro

Organizing and presenting a set is a skill that has helped my career as an electronic music performer. A well-practiced set is suitable for tours, guest lectures, and festival performances because of its efficiency and flexibility. Many collaborative opportunities have come from meeting musicians and dancers while presenting in this format. Audiences also get to experience music they have heard on phones and computers in a more intimate and focused context. As for artistic growth, curating a set allows me to improve and reimagine existing works. Every set performance is a practice for a future show. Sometimes, transitions become seeds for new compositions.

A Musician’s Productivity in Numbers: a case study in an interdisciplinary show

Introduction

How much time and energy does a musician spend on an interdisciplinary project? A three-month-long production period does not equal 90+ days of labor. How many days are spent on music, and how many of them are spent on meetings with the collaborators? How much of the music created for the show ends up in the show? I ask myself these questions to better understand the practical role of a music creator in a project involving performing artists of other fields. Answers to these questions require measurable data, such as the total working days and total minutes of music composed. These numbers lead to an insight into the productivity of music creators.

In 2023, ArtLab J, Detroit Puppet Company, and I created a one-hour show titled Objects at Play (video link). It was a non-verbal dance and puppet show aimed at young audiences. The first meeting was on February 16, 2023, and the show premiered on May 27, 2023, at the Detroit Film Theatre. I recorded my production process from the start of the project to study my productivity as a collaborator. As I composed, recorded, and mixed music, I gathered and organized the data according to the number of days I worked and the minutes of music I produced. The analysis and statistics revealed that only a fraction of the total collaborative period was spent on person-to-person interaction, and that less than a third of the total music shared with the collaborators ended up in the show.

There are three limitations to this article. 

  1. I am sharing my work process as a solo electronic musician who could compose and share music without other musicians. The workflow described in the following section may not apply to performers or composers of non-electronic genres. 
  2. No similar data were collected from the collaborators of Objects at Play. A comparison of productivity across disciplines was outside the scope of this study.
  3. The analysis focuses on the practical aspects of collaboration. There will be no aesthetic discussion of Objects at Play.

Data Gathering

I used a production diary, consisting of a web folder with a session log, screenshots, and photos, to record the project’s progress. In the SessionLog text file, I briefly described the work I did on each workday. Each entry has links to photos of handwritten notes or screenshots of the hard drive folder containing the music files used for the project. The screenshots function as a reminder of content changes in the music tracks. The Old Versions folder in the screenshots contains obsolete or rejected session files. I kept these files to calculate the amount of music that was not used at the premiere.

Work Routine

The creative team, consisting of choreographer Joori Jung of ArtLab J, theater director Carrie Morris of Detroit Puppet Company, and me, shared a Google Drive folder for remote communication and file transfer. Everyone on the team was working on multiple projects, so daily or weekly meetings were not an option. The list below shows how I worked on the project in this context as a musician.

  1. The first in-person meeting with Joori and Carrie was on 2/16.  The three discussed the overall vision of the piece. 
  2. After meeting #1, I worked on short and independent tracks that could match the to-be-developed scenes. 
  3. I shared nine music tracks with the collaborators via Google Drive before the second meeting.
  4. At the second meeting on 3/9, Joori and Carrie shared their work-in-progress scenes. The directors also shared current music tracks-to-scene placement. 
  5. After meeting #2, I made five additional tracks. I also revised and expanded the tracks used in the scenes.
  6. I shared the updated tracks with the collaborators before the third meeting.
  7. At the third meeting on 4/25, the directors shared new tracks-to-scene placement. The deadline for the final version of the music was set. 
  8. After meeting #3, I made three additional tracks. I continued revising and mixing the tracks to a presentable form.
  9. I delivered the final versions of the tracks. The directors and performers continued working on the project until the premiere on 5/27, but I did not create more music for the show.
  10. Separate from the theater premiere, I worked on a 14-track album with edits suited for audio-only release. It was published on Bandcamp a day after the premiere.

Note that I made the aesthetic decisions in creating the music, but the directors in charge of movement and staging decided the music’s length, order, and selection. Unlike in solo projects, the decisions that drove the project forward were not mine by design.

Data Organization

I organized the information in the production diary into two categories. The first category traces how the allocation of music tracks to the seven scenes changed after each collaborator meeting. The second category is statistics on the days worked and the amount of music produced.

Tracks-to-Scene Organization

Figure 1 shows how the use of each track I made and shared with the collaborators changed throughout the project. The blocks with letters A to Q represent 17 tracks with independent musical themes. I composed the first nine drafts after the first meeting, five after the second meeting, and three more after the third meeting. These tracks were available as separate mp3 files on Google Drive for the choreographer and the theater director.

<Figure 1>

The middle column represents the tracks-to-scene assignment after the second collaborators’ meeting. Four scenes needed new music. Two scenes required a combination of tracks. All tracks needed expansion and revision in terms of length and formal development. Note that four of the nine tracks shared before the second meeting were rejected.

The right column represents the revised tracks-to-scene assignment after the third meeting. It became the final version. Some tracks included in the previous version, such as tracks A and L, ended up being excluded from the show. All but one of the tracks rejected after the second meeting came back as parts of Scenes 5 and 6. Track M changed its function from the theme of Scene 6 to the finale of Scene 5.

Productivity Analysis 

I measured the amount of work by the days I spent on the project and the length of the music created and shared with the directors. There were 102 days from the initial meeting on 2/16 to the album release on 5/28. According to the production diary,

  • I worked 37 days on this project (36.3% of the total project days).
  • I met with the collaborators in person on 3 days (8.1% of the working days, 3% of the total project days).
  • I did not record how many minutes I worked each day, so I cannot calculate the total hours worked.

In terms of the total amount of music, I gathered the following from screenshots and project files (a quick check of these percentages follows the figures below).

  • 14 out of 17 tracks made it into the show (82.4%).
  • The total amount of music shared with the collaborators was 11,474 seconds, consisting of 34 drafts (Figure 2).
  • The premiere used 3,319 seconds of music (Figure 3). That is 28.9% of the music shared with the collaborators.
  • The project used a total of 14.1 GB on the hard drive. The files were Logic Pro sessions, SuperCollider files, and audio recordings of me playing a melodica.

<Figure 2>

<Figure 3>
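
For transparency, here is a small check of the percentages above, written in SuperCollider (the environment used elsewhere on this site). The variable names are ad hoc stand-ins and not part of any project file; any calculator would do the same job.

(
// Re-derive the percentages quoted above from the raw numbers in this article.
var totalDays = 102, workDays = 37, meetingDays = 3;
var sharedSeconds = 11474, usedSeconds = 3319, tracksShared = 17, tracksUsed = 14;
(workDays / totalDays * 100).round(0.1).postln;        // 36.3 -> share of project days worked
(meetingDays / workDays * 100).round(0.1).postln;      // 8.1  -> share of working days spent in meetings
(meetingDays / totalDays * 100).round(0.1).postln;     // 2.9  -> roughly 3% of project days
(tracksUsed / tracksShared * 100).round(0.1).postln;   // 82.4 -> share of tracks used in the show
(usedSeconds / sharedSeconds * 100).round(0.1).postln; // 28.9 -> share of shared music used at the premiere
)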

Interpretation of the Data 

The collaborative process is about quickly adopting and adapting to changes. My role as the music composer was to react to the developing dance and puppetry, which meant constant addition, elimination, and revision. Fourteen out of 17 tracks making it into the final version looks like a satisfactory rate, but by duration, the show used less than a third of the music shared with the collaborators. At the same time, once-rejected music can become useful if the circumstances change. Keeping the Old Versions folder intact was the right strategic decision.

I worked on the project for about a third of the total project period and waited for the collaborators to develop their parts asynchronously from my music production schedule. Waiting is part of the process for musicians in interdisciplinary projects. It is possible to have time to work on a separate project while engaged in a long-term collaboration.

Notice that I did not discuss budget and fees in this article. The amount of time and energy spent on a project does not account for the creator’s previous experience and skill. The 37 days I spent on Objects at Play could have been 90 days of work for some or 10 days of work for others. My productivity analysis is not a suggestion for budgeting or calculating artist fees. Its objective is to serve as a reference for better collaborative practice.

847 Twins – Brief Analysis

The production of 847 Twins, the title track of the album Fan Art, is documented in four sections. The first section, Program, is a one-paragraph description of the music written for a concert booklet or album promotion. I share information and thoughts that may help listeners enjoy the music. The second section, Form, is for creators who want to learn how I used electronic sounds in the composition. The third section, Code, is for technologists who want to learn how I designed the piece in SuperCollider, a code-based audio programming environment. Links to the code are available here. The last section, Anecdote, has extra narrative relevant to 847 Twins but is optional for enjoying the piece.

If preferred, read this article in PDF format.

Program

847 Twins is a two-movement piece based on harmonic progressions of Prelude & Fugue in C Minor by J.S. Bach. An electronic remake of Bach is a well-known practice pioneered by Wendy Carlos and Pierre Schaeffer (Switched-On Bach & Bilude). I learned so much from reading and listening to their works. J.S. Bach is also my hero composer. Therefore, it seemed appropriate to dedicate a song to my musical cornerstones in an album about fandom.

Listen to the tracks linked below before reading the next sections.

The tracks are available on other major platforms at  https://noremixes.com/nore048/

Form

Mvt I. Pluck

Pluck and Blip, the two movements of 847 Twins, are algorithms written in SuperCollider that use the harmonic progression of the Prelude in BWV 847. The downloadable code, 847_Pluck.scd, generates randomized voicing patterns played by a guitar-like synth. Below is a step-by-step explanation of the composition process, followed by a simplified code sketch of the core idea.

  1. Design an electronic string instrument. Each note of this instrument is detuned at a different ratio every time the string is “plucked.” The note’s duration, dynamic, string stiffness, and pan position also vary randomly. 
  2. Using the instrument in Step 1, strum a chord with the notes of a measure in BWV 847. Unlike on a guitar, a strummed chord can have multiple pan positions, accents, and note durations due to the randomization in Step 1.
  3. Each measure of BWV 847 is played four times before advancing to the next measure.
  4. Add a bass part with gradually increasing loudness. It plays the lowest note in the corresponding measure. 
  5. Add the intro and the outro for a better form. They are not quoted from BWV 847.

In short, the first movement of 847 Twins is a reinterpretation of BWV 847 featuring an imaginary string instrument and a synth bass. I loved how Bach created exciting music with a predictable rhythmic pattern; the key was harmony and voicings. I wanted to emphasize that aspect with an additional layer of dynamics and articulations in Pluck. The added bass line, which imitates the “left hand” of a basso continuo, fills in the low-frequency spectrum of the piece. The bass part is best experienced with headphones or a subwoofer.
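
To make Steps 1 and 2 concrete, here is a minimal sketch of the idea. It is not the published 847_Pluck.scd: the SynthDef name, its arguments, and the example chord are hypothetical stand-ins, and the actual piece uses the SynthDefs and functions described in the Code section below.

(
// Step 1: a detuned, randomized "string" (hypothetical SynthDef, not the piece's SynthDef("Gtr"))
SynthDef(\pluckSketch, {
    arg freq = 220, amp = 0.2, dur = 2, coef = 0.3, pan = 0;
    var exciter, sig;
    exciter = WhiteNoise.ar(1) * EnvGen.ar(Env.perc(0.001, 0.02));            // a short noise burst "plucks" the string
    sig = Pluck.ar(exciter, Impulse.kr(0), 0.05, freq.reciprocal, dur, coef); // Karplus-Strong string model
    DetectSilence.ar(sig, doneAction: 2);                                     // free the synth once the string rings out
    Out.ar(0, Pan2.ar(sig, pan, amp));
}).add;
)

(
// Step 2: strum one measure's notes as a chord; every "string" gets its own
// random detune, dynamic, duration, stiffness, and pan position.
var chord = [48, 51, 55, 60, 63];   // an example C minor voicing in MIDI notes, not quoted from BWV 847
{
    chord.do { |note|
        Synth(\pluckSketch, [
            \freq, (note + rrand(-0.15, 0.15)).midicps,  // random detune per pluck
            \amp,  rrand(0.05, 0.2),
            \dur,  rrand(1.5, 3.0),
            \coef, rrand(0.1, 0.5),
            \pan,  rrand(-0.8, 0.8)
        ]);
        0.03.wait;                                       // a small per-string delay simulates the strum
    };
}.fork;
)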

Mvt II. Blip

The first movement lacked elements of counterpoint, so I tried to create an electronic polyphony in the second movement. In Blip, each measure has 3-6 parts playing different phrases derived from a measure in BWV 847. The phrase shape, the number of voices, and the articulation are determined randomly at every measure, creating a disjunct yet related form. Schaeffer’s Bilude explores this idea by combining piano performance and recorded sounds.

Below is my process of creating a random phrase generator, followed by a simplified sketch of the idea. Please run 847_Blip.scd to hear the piece.

  1. Create a list of pitch sets by reducing repeating notes in each measure of BWV 847.
  2. Make three different synth sounds.
  3. Make a phrase generator that uses the list in Step 1 and the synths from Step 2. The instrument choice, phrase length, note subdivisions, and articulations are randomized. The SuperCollider code also has the option to generate a rhythmic variation (i.e., insert a rest instead of a note).
  4. Make a polyphony generator that spawns the phrase generator described in Step 3. The number of polyphonic voices and their octave transpositions are random. 
  5. Play and record Step 4 twice. Then, import the tracks to a DAW. Insert a reverb plugin on one track. The reverb should be 100% wet. 

The algorithm described above creates different timbres, polyphonic patterns, and numbers of voices at every measure. Furthermore, every rendition of the SuperCollider code makes a unique version of Blip. One measure can be a duet of two-note phrases, and the following measure can be an octet of eight phrases played across a four-octave range. The room sound created by the DAW reverb plugin doesn’t reflect the source, but it sounds similar enough to be heard as part of a whole.
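
Here is a minimal sketch of Steps 1 through 4 (the DAW reverb in Step 5 is omitted). It is not the published 847_Blip.scd; the pitch set, the SynthDef, and the function names are hypothetical stand-ins for the actual PBeep/TBeep/SBeep instruments and the ~phrase/~section functions described in the Code section below.

(
// Step 2 (reduced to one instrument): a slightly detuned percussive beep
SynthDef(\beepSketch, {
    arg freq = 440, amp = 0.1, dur = 0.2, pan = 0;
    var sig = SinOsc.ar(freq * [1, 1.003]).sum * 0.5;          // two slightly detuned sines
    sig = sig * EnvGen.ar(Env.perc(0.01, dur), doneAction: 2);
    Out.ar(0, Pan2.ar(sig, pan, amp));
}).add;
)

(
// Step 1: one measure reduced to a pitch set (example values, not quoted from BWV 847)
~arp = [60, 63, 67, 68, 62];

// Step 3: a phrase generator with random length, subdivision, rests, and octave transposition
~phrase = { |octave = 0|
    Routine {
        var len = rrand(4, 8);             // random phrase length
        var step = [0.125, 0.25].choose;   // random note subdivision, in beats
        len.do {
            if (0.8.coin) {                // occasionally insert a rest instead of a note
                Synth(\beepSketch, [
                    \freq, (~arp.choose + (octave * 12)).midicps,
                    \dur, step * 2,
                    \pan, rrand(-0.7, 0.7)
                ]);
            };
            step.wait;
        };
    };
};

// Step 4: a polyphony generator that spawns 2 to 6 phrases in random octaves
rrand(2, 6).do { ~phrase.(rrand(-1, 2)).play };
)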

Code

Mvt I. Pluck

The SuperCollider file for Pluck consists of seven parts. Please download and use 847_Pluck_Analysis.scd to hear and modify each part. Make sure to run the line s.options.memSize=8192*16 to allocate enough memory. A stripped-down skeleton of two of these parts follows the list.

  • SynthDefs: SynthDef(“Gtr”) uses a Karplus-Strong physical model with controllable pan, frequency, stiffness, amplitude, and duration. SynthDef(“Bass”) makes a sinusoid tone with a percussive amplitude envelope. The UGen Lag.kr smoothens the sharp transient of the amplitude envelope. 
  • ~onenote: this function uses two SynthDef(“Gtr”) to create a detuned note. The amount of detuning is randomized along with other parameters of the SynthDef.
  • ~stroke: this function creates instances of ~onenote with pitches specified in the ~chords array.  ~chords is a collection of all the notes in the Bach Prelude, categorized and indexed by measure number. The order of the notes in a measure is random.  ~stroke plays the chord in sequence or reverse to simulate a guitar’s up and down stroke motions. 
  • ~strums: this function continuously triggers ~stroke. The global variable ~pulse determines the tempo. The ~strumsend function is used once, for the ending.
  • ~clock: this function changes the chord progression at time intervals set by the global variable ~mdur. It also changes the parameters of ~strums by altering the values of the global variables ~mm, ~accent, ~volume, ~notedur, and ~stiff. Note that both the ~strums and ~clock functions must run simultaneously for a correct chord progression.
  • ~bassline: this function plays SynthDef(“Bass”) a few seconds after the start of the piece. It uses the if condition to change the rhythmic pattern. The line pitch=~chords.at(count).sort.at(0) picks the lowest note of each measure as a bass note.
  • SystemClock: this scheduler syncs ~strums, ~clock, and ~bassline to play a version of Pluck. Every rendition of SystemClock will make a new variation of the track.    
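
As an illustration of two of the details above, here is a stripped-down skeleton of the ~onenote detuned pair and the ~bassline lowest-note selection. It reuses the hypothetical \pluckSketch SynthDef from the Form section and is not the actual 847_Pluck_Analysis.scd.

(
// a tiny stand-in for ~chords: the notes of each "measure", indexed by measure number
~chords = [[48, 51, 55, 60], [47, 50, 53, 59], [48, 51, 55, 60]];

// the ~onenote idea: two copies of the same synth, detuned against each other
~onenote = { |note, amp = 0.1|
    var detune = rrand(0.05, 0.3);                       // random detune amount in semitones
    [note - (detune * 0.5), note + (detune * 0.5)].do { |n|
        Synth(\pluckSketch, [\freq, n.midicps, \amp, amp * 0.5, \pan, rrand(-0.8, 0.8)]);
    };
};

// the ~bassline idea: the lowest note of the current measure becomes the bass note
~bassnote = { |count| ~chords.at(count).sort.at(0) };

// play measure 0 as a chord, plus its bass note an octave below
~chords.at(0).do { |n| ~onenote.(n) };
~onenote.(~bassnote.(0) - 12, 0.2);
)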

Mvt II. Blip

The SuperCollider file for Blip consists of four interconnected parts. Please download and run 847_Blip_Analysis.scd to hear each part. A skeleton of the cue-list idea appears after the list.

  • SynthDefs: The three SynthDefs, PBeep, TBeep, and SBeep, are all slightly detuned percussive instruments featuring a classic oscillator waveform, such as sine, triangle, or pulse.
  • ~phrase: this function creates a short melodic pattern based on pitch sets received from the global variable ~arp. It controls which SynthDef to use, as well as amplitude, phrase length, note duration, and transposition. The last two arguments activate or deactivate the random rhythm generation and the arpeggio pattern variation.
  • ~section: this function duplicates ~phrase. The number of ~phrase and octave transpositions are randomized. The function also makes further variations on amplitude, note duration, and panning.
  • The Routine in the last section uses the ~piece array as a cue list with details on when and how to trigger the ~section. The array ~chords is a list of all the notes in corresponding measures of the Bach Prelude. The Routine also sends a changing pitch set from ~chords to ~phrase via the global variable ~arp.
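
Here is a skeleton of the cue-list idea described in the last bullet: a Routine walks through a ~piece-like array, updates the global pitch set, and spawns a fresh randomized section at every measure. It reuses the hypothetical ~phrase function from the Form sketch, and the array contents are placeholders rather than notes from BWV 847.

(
// stand-in measures, and a cue list of [measure index, duration in beats]
~chords = [[60, 63, 67], [59, 62, 65, 68], [60, 63, 67, 72]];
~piece  = [[0, 2], [1, 2], [2, 4]];

Routine {
    ~piece.do { |cue|
        var measure = cue[0], dur = cue[1];
        ~arp = ~chords.at(measure);                      // hand the new pitch set to ~phrase
        rrand(3, 6).do { ~phrase.(rrand(-1, 2)).play };  // a new randomized polyphonic section per measure
        dur.wait;
    };
}.play;
)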

Anecdote

847 Twins does not use the Adagio section of the Prelude and Fugue. When composing the first movement, I could not transition from a constant 16th-note drive to a free and improvisational ending. I tried to address this incompleteness by writing a complementary movement, Blip, but it did not work out. I found a satisfying solution six months after completing 847 Twins by incorporating an instrument I could improvise on aptly and freely. Nim6tet, the sixth track in Fan Art, has six layers of no-input mixer improvisation guided by the chord progressions of the Adagio section. It shamelessly shows off no-input mixer sounds I cannot create with other instruments.

It took many attempts over a period of 1.5 years to finish three tracks based on the first half of BWV 847. The electronic interpretation of the Fugue is a puzzle yet to be solved.

More Analysis and Tutorials

Updated on 4/13/2023

Personal Statement – Explanation and Tips

Statement letter for tenure portfolio – music technology

My promotion and tenure portfolio included a personal statement. According to my department’s guidelines, a 3-5 page personal statement “addresses how their research, scholarly, or creative work has developed over time and what activities are likely to be undertaken in the future.” The document is sent to the external reviewers at the beginning of the evaluation process and is read by the department, college, and university committee. Like statement letters of any job application, the document is essential for both applicants and evaluators. 

Tenure-track faculty in music get varying degrees of help and guidance with the personal statement, but examples are rare in the public domain. Information on how to write or evaluate a statement for tenure-track faculty specializing in music technology is even rarer. Therefore, I shared mine as an example from a tenure-track music technology faculty member. I submitted my tenure packet in Summer 2021 and was promoted to Associate Professor of Music in Spring 2022 at Wayne State University, an R1 public institution. The complete guidelines for WSU’s promotion and tenure packet are here.

Tips

If you are writing a personal statement for a university tenure evaluation in the field of music technology, here are a few tips.

  • Seek an advisor in your institution. Learn the priorities and process specific to your job. There is a chance that you are the only, and the first, tenure-track music technology faculty member. If so, find out what counts as acceptable activities and achievements for the review. (Does my solo performance on a no-input mixer and drum machine count as creative research?)
  • Know that the statement may be read and evaluated by both experts (external reviewers) and non-experts of your field (university-level committee). You need to convince people you do not know that you are good at what you are doing. (How do I convince my colleagues at Math that my no-input mixer and drum machine piece is research?)
  • Provide accessible and measurable evidence. Examples include a list of invited performances, documentation of refereed competitions, journals, and conferences, and tracks released by record labels. Other people or institutions accepting your work can prove that your creative outputs are valued. (Multiple conference acceptances and guest artist invitations from other universities prove that my no-input mixer and drum machine pieces are making an impact)
  • Do not be humble. Your department committee will need to convince the college-level committee how great you are. Your college-level committee needs to convince the university-level committee how awesome you are. Give them measurable achievements so that they can advocate for you. (Hey, Joo Won’s no-input mixer piece may not be pleasing, but look at the number of conference acceptances he received to play the piece)

The tenure process was not easy, but it gave me an opportunity to evaluate my career thoroughly. I look forward to writing a new version of the personal statement for the next promotion (which will be many years from now).