All posts by joowonpark

Seoseok Bell – Brief Analysis

Seoseok Bell is a track on Dot Zip, an album of 22 generative music pieces. The album’s purpose is to demonstrate a uniquely electronic sound rendered with code. Each track has downloadable SuperCollider code that a listener can render and modify. Listen to Seoseok Bell on Bandcamp and download the SuperCollider code here.

The following paragraphs analyze the form, code, and musical aspirations behind Seoseok Bell. They show how to start and develop a composition from a single synthesized sound. The lesson is most effective if the reader has SuperCollider installed on their computer. Please watch a tutorial video on how to run the SuperCollider code written for Dot Zip.

Program

Seoseok (서석) is a small town in a mountainous region of Korea. The sound of the bell in a chapel in the town reminds me of peace and love. The piece recreates (or interprets) the bell sound using an additive-synthesis-like process and then presents it in an ambient-like style.

Form

Seoseok Bell creates a bell-like tone by adding multiple sine waves. The bell tones and a simple bass line then form three-part contrapuntal music. The resulting music has many variations due to the randomization of overtone frequencies, note sequences, and rhythms. The SuperCollider code SeoSeokBell_DotZip.scd does this through the following steps.

  • Step 1: Make two sine waves detuned to each other with a randomized frequency difference, creating a single tone with a pulse.
  • Step 2: Create an overtone series. The notes in the overtone series are randomly detuned.
  • Step 3: Play the sound multiple times with short, randomized time intervals.
  • Step 4: Generate soprano and tenor parts by randomly choosing a note in a scale. At the same time, generate a bass part with simpler overtones in tune.

Code

SeoSeokBell_DotZip.scd has the following sections. Watch a tutorial video on how to use the code.

  • SynthDef(“SingleB”): synthesizes the sound described in Step 1
  • ~bell: makes the sound described in Step 2
  • ~shake: makes the sound described in Step 3
  • ~sop, ~tenor, and ~bass: make the sounds described in Step 4
  • SynthDef(“NiceB”): synthesizes the bass tone described in Step 4
  • SystemClock.sched: schedules the start and stop times of ~sop, ~tenor, and ~bass

SynthDef(“SingleB”) and SynthDef(“NiceB”)

The two SynthDefs use simple waveform generators (SinOsc.ar and LFPulse.ar) as audio sources. SynthDef(“SingleB”) uses a percussive amplitude envelope with randomized attack and release times. The envelope also includes a transient generated with LFNoise2.ar. SynthDef(“NiceB”) has an envelope on the filter frequency of RLPF.ar.
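Below is a minimal sketch of the Step 1 idea. It is not the published SynthDef: the name, argument list, and default values are my assumptions, and the LFNoise2.ar transient is omitted. Two sine waves detuned by a few hertz beat against each other under a percussive envelope.

(
SynthDef(\SingleBSketch, { |freq = 440, detune = 2, amp = 0.2, atk = 0.01, rel = 2, pan = 0|
	var env = EnvGen.kr(Env.perc(atk, rel), doneAction: 2); // percussive amplitude envelope
	var sig = SinOsc.ar(freq) + SinOsc.ar(freq + detune); // detuned pair pulses at about detune Hz
	Out.ar(0, Pan2.ar(sig * env * amp * 0.5, pan));
}).add;
)

Synth(\SingleBSketch, [\freq, 440, \detune, 2]); // a 440 Hz tone with a ~2 Hz pulse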

~bell

In the ~bell function, SynthDef(“SingleB”) is duplicated using a Routine. The formulas below determine the frequencies of the duplicated Synths.

pitch = (freq * count) * rrand(0.99, 1.01);
pitch2 = pitch * interval.midiratio * rrand(0.99, 1.02) * rrand(0.99, 1.02);

where the argument count increases by 1 at every iteration of a .do loop.

Once defined, the ~bell function generates a sound using the following arguments:

~bell.(fundamental frequency, amplitude, duration, pan position, interval value of overtones)
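A hedged sketch of the idea, using the SingleBSketch SynthDef from above (the loop size, amplitude scaling, and default values are assumptions, not the published code):

(
~bellSketch = { |freq = 200, amp = 0.2, dur = 3, pan = 0, interval = 7|
	Routine({
		(1..8).do({ |count| // count runs 1, 2, 3, ... as in the formulas above
			var pitch = (freq * count) * rrand(0.99, 1.01);
			var pitch2 = pitch * interval.midiratio * rrand(0.99, 1.02) * rrand(0.99, 1.02);
			Synth(\SingleBSketch, [\freq, pitch, \amp, amp / count, \rel, dur, \pan, pan]);
			Synth(\SingleBSketch, [\freq, pitch2, \amp, amp / count, \rel, dur, \pan, pan]);
		});
	}).play;
};
)

~bellSketch.(200, 0.2, 3, 0, 7); // fundamental, amplitude, duration, pan, overtone interval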

~shake

~shake duplicates the ~bell function using a Routine with randomized .wait times, creating slight delays between the Synth instances. Once defined, the ~shake function generates a sound using the following arguments:

~shake.(fundamental frequency, amplitude, duration, interval value of overtones, delay time)
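A sketch of the same idea (the instance count and pan randomization are assumptions):

(
~shakeSketch = { |freq = 200, amp = 0.2, dur = 3, interval = 7, del = 0.05|
	Routine({
		3.do({
			~bellSketch.(freq, amp, dur, rrand(-0.7, 0.7), interval);
			rrand(del * 0.5, del * 1.5).wait; // short randomized gap between instances
		});
	}).play;
};
)

~shakeSketch.(200, 0.2, 3, 7, 0.05); // fundamental, amplitude, duration, overtone interval, delay time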

~sop, ~tenor, and ~bass

The three functions ~sop, ~tenor, and ~bass are Routines that play ~shake or Synth(“NiceB”) with frequencies picked from the arrays ~scale or ~scalebass. The global variables ~bpm and ~beat determine the wait times. The three Routines receive .play and .stop messages according to the timings set by SystemClock.sched.
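A sketch of how one such part can work, building on ~shakeSketch above (the scale, octave, rhythm choices, and timings are illustrative, not the values in SeoSeokBell_DotZip.scd):

(
~scaleSketch = [60, 63, 65, 67, 70].midicps; // stand-in for ~scale
~bpm = 60;
~beat = 60 / ~bpm;

~sopSketch = Routine({
	loop({
		~shakeSketch.(~scaleSketch.choose * 2, 0.1, 3, 7, 0.05); // soprano range: one octave up
		(~beat * [1, 2, 4].choose).wait; // randomized rhythm in beats
	});
});

SystemClock.sched(0, { ~sopSketch.play; nil });  // start immediately
SystemClock.sched(30, { ~sopSketch.stop; nil }); // stop after 30 seconds
)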

Uniquely Electronic

In electronic music, a sound design process is often the starting point of a composition. Seoseok Bell began as an exercise inspired by the Risset bell: I wanted to create a bell sound using additive synthesis. However, such an exercise should not end as sound design alone. The composer or researcher should present the findings in a musical context.


What I Remember About My First Gig

I scanned a photo of my first electronic music improvisation gig in 2002.

It counts as the first gig because it was the first performance in front of an audience that did not include anyone I knew. It was also the first time I played a complete set with an electronic instrument. The concert was probably in late 2001 or early 2002, and I don’t remember much of it other than bits of incidents and happenings. A personal keyword unifying the gig is uncomfortable.

  • It was the second time in my life traveling to Brooklyn.  When I arrived at the performance space, everybody except me seemed confident about what they were doing. I was forcing myself not to show my newbieness. It felt weak to show how impressed I was with others’ art and sound.
  • The event organizer introduced himself as Doc. Doc provided a place to hang out in an apartment and food for all the performers. He made a soup (a chili?) with too much ginger. After hurrying to eat the soup among people I didn’t know, I stayed in the apartment’s hallway.  In hindsight, everybody was nice to me. I just did not know how to react to kindness from strangers.
  • I don’t remember much about the performance. From the looks of the picture, I was performing nervously and seriously. I had the attitude of playing in a college recital hall, but the stage was a folding table in a dark basement with DIY lighting. I did not make eye contact or interact with the audience.
  • I felt I did not belong to the event and the culture around it. So I went to sleep early in the room of a person I did not know, woke up at dawn, and hurried to the bus stop. I didn’t say my thank-yous or goodbyes.

That was my first, queasy gig. The quality of the music I presented was OK, but the quality of my social performance was abysmal. I could have made friends and fans, but I ran away. Now that 20+ years have passed since that first gig, I feel comfortable socializing with strangers (if needed). It took me a while to get there. Perhaps teaching helped. I share this experience with my students, who are younger than the 2001 me, to let them know that it is OK to feel bad after a gig. The career does not end there. Just do more performances, make a few more mistakes, and find a way to feel comfortable showing what you love in front of people you do not know.

I wish I had audiovisual documentation of the performance, but I had a Motorola cell phone at the time. However, I found a backup of a video demonstration Luis and I made a few weeks after the Brooklyn performance. It is delightful to see how much my musical practice has changed, and how much it has remained the same, since 2002.

I used a loaned Radio Baton, and my friend Luis Maurette used my Phat-boy MIDI controller. We built a Max patch for the machines and ran it on my very first iBook. The video was shot in an ensemble practice room at Berklee College of Music. Luis and I were Electronic Production and Design students (back then, the major was called Music Synthesis). Ableton Live had been released just a few months earlier.

Game Controller Comparison For SuperCollider

SuperCollider’s HID class lets users connect many types of game controllers without additional software installation. While establishing a connection between a human input device and SuperCollider is easy, decoding the mapping system requires time and effort. A user needs to press a button on the controller, monitor the ID number assigned to that specific button, and then document the result for further use. The good news is that once a user figures out and shares the mapping structure of a device, there is no need to repeat the discovery procedure until the next major OS update.

The chart below shows the control-surface-to-SuperCollider mapping of four game controllers: the DualShock 4 for PlayStation 4, the DualSense for PlayStation 5, the Logitech F310, and the Xbox Wireless Controller. It lists elements (ID numbers corresponding to a specific control surface on a device) and their value ranges for the HIDFunc.element command. With the element numbers and ranges below, one can map the incoming data from a controller to various parameters available in SuperCollider.

Element Numbers and Range of Game Controllers

Sony DualShock 4 (PS4): Bluetooth, ID (1356, 2508)
Sony DualSense (PS5): Bluetooth, ID (1356, 3302)
Logitech F310, Mode D: USB, ID (1133, 49686)
Microsoft Xbox Wireless Controller: Bluetooth, ID (1118, 765)

Each cell lists the element number followed by its value range. Control surfaces are labeled [sony / xbox].

| Control Surface [sony / xbox] | DualShock 4 | DualSense | F310 | Xbox |
|---|---|---|---|---|
| Square / X | 0, (0 or 1) | 0, (0 or 1) | 0, (0 or 1) | 13, (0 or 1) |
| X / A | 1, (0 or 1) | 1, (0 or 1) | 1, (0 or 1) | 10, (0 or 1) |
| O / B | 2, (0 or 1) | 2, (0 or 1) | 2, (0 or 1) | 11, (0 or 1) |
| Triangle / Y | 3, (0 or 1) | 3, (0 or 1) | 3, (0 or 1) | 14, (0 or 1) |
| L1 / LB | 4, (0 or 1) | 4, (0 or 1) | 4, (0 or 1) | 16, (0 or 1) |
| R1 / RB | 5, (0 or 1) | 5, (0 or 1) | 5, (0 or 1) | 17, (0 or 1) |
| L2 / LT button | 6, (0 or 1) | 6, (0 or 1) | 6, (0 or 1) | not found yet |
| R2 / RT button | 7, (0 or 1) | 7, (0 or 1) | 7, (0 or 1) | not found yet |
| Share / Back | 8, (0 or 1) | 8, (0 or 1) | 8, (0 or 1) | 25, (0 or 1) |
| Options / Start | 9, (0 or 1) | 9, (0 or 1) | 9, (0 or 1) | 21, (0 or 1) |
| L3 / LSB button | 10, (0 or 1) | 10, (0 or 1) | 10, (0 or 1) | not found yet |
| R3 / RSB button | 11, (0 or 1) | 11, (0 or 1) | 11, (0 or 1) | not found yet |
| Logo | 12, (0 or 1) | 12, (0 or 1) | not found yet | not found yet |
| Trackpad button | 13, (0 or 1) | 13, (0 or 1) | N/A | N/A |
| L3 / LSB x-axis | 14, (0 to 255) | 14, (0 to 255) | 13, (0 to 255) | 0, (0 to 65535) |
| L3 / LSB y-axis | 15, (0 to 255) | 15, (0 to 255) | 14, (0 to 255) | 1, (0 to 65535) |
| R3 / RSB x-axis | 16, (0 to 255) | 16, (0 to 255) | 15, (0 to 255) | 2, (0 to 65535) |
| R3 / RSB y-axis | 17, (0 to 255) | 17, (0 to 255) | 16, (0 to 255) | 3, (0 to 65535) |
| L2 / LT continuous | 19, (0 to 255) | 19, (0 to 255) | not found yet | 26, (0 to 1023) |
| R2 / RT continuous | 20, (0 to 255) | 20, (0 to 255) | not found yet | 27, (0 to 1023) |
| up | 18, (0) | 18, (0) | 17, (0) | 28, (1) |
| up+right | 18, (1) | 18, (1) | 17, (1) | 28, (2) |
| right | 18, (2) | 18, (2) | 17, (2) | 28, (3) |
| right+down | 18, (3) | 18, (3) | 17, (3) | 28, (4) |
| down | 18, (4) | 18, (4) | 17, (4) | 28, (5) |
| down+left | 18, (5) | 18, (5) | 17, (5) | 28, (6) |
| left | 18, (6) | 18, (6) | 17, (6) | 28, (7) |
| left+up | 18, (7) | 18, (7) | 17, (7) | 28, (8) |
| release | 18, (8) | 18, (8) | 17, (8) | 28, (0) |

Link to Google Sheets version

Analysis of Element Numbers and Range

The number of available control surfaces, as well as their ranges, varies between the brands. However, all game controllers have three types of input methods. 

  • Button: sends 1 when pressed and 0 when released.
  • Directional Pad: assigns eight integers to the eight directions and one integer to the release/unpressed state. The release state is mapped to value 8 on the Sony and Logitech controllers, while it is assigned to 0 on the Xbox controller (see the sketch after this list).
  • Continuous Control: sends a range of numbers, like a slider or a knob on a MIDI controller. All but the Xbox controller send data ranging from 0 to 255. The Xbox controller has two types of ranges (0 to 65535 and 0 to 1023).
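For example, here is a sketch of decoding the directional pad on a Sony controller (element 18 in the chart). It assumes a connection is already open, as described in the demo section below:

(
~dirs = ["up", "up+right", "right", "right+down", "down", "down+left", "left", "left+up", "release"];
HIDFunc.element({ |...args|
	~dirs[args[1]].postln; // args[1] holds the raw integer, 0 through 8
}, 18);
)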

More expensive controllers have more features, such as motion sensors and microphone input, but the HID class does not detect them. The number of game controller features available to HID seems to depend on the host OS version. In 2018, HID received DualShock’s motion sensor and trackpad data when connected to a Macintosh with a USB cable (source). After a few macOS updates, I could not replicate the result in 2025. It is possible that additional input methods could be detected if an external app or extension were installed. I did not test that possibility, as my goal is to avoid additional technical steps.

Demo SuperCollider Patches

Readers can test, study, and modify the controller mapping structure with the .scd files provided in the link above. All four files, one for each controller model, have the same parts: 1. Controller initialization, 2. Connection tester, 3. Controller-to-sound example. The coding style is based on the example section of the HIDFunc manual.

The controller initialization section consists of arrays and functions that connect and monitor signals from the game controller. The most important commands in this section are HID.findAvailable, HID.open, and HIDFunc.element. These commands detect the available devices, connect a specific device using its unique ID numbers, and determine what SuperCollider should do when a control surface is triggered. The rest are functions built to use those three commands efficiently. Users must evaluate the code inside the first parenthesis, labeled //1. Controller Initializations, to make the second section of the code work.
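A minimal sketch of those three commands, using the DualShock 4 ID numbers from the chart above (the variable name is illustrative; the demo files wrap these in helper functions):

HID.findAvailable; // detect the available HID devices
HID.postAvailable; // list them in the Post window
~myController = HID.open(1356, 2508); // connect using the device's unique ID pair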

After the initialization, evaluate the code inside the second parenthesis, labeled //2. Test the Controller. It is important to evaluate one line of code at a time for better testing. When a ~whichsurface.(element #) function is evaluated, an array containing the element number, control surface name, and value will appear in the Post window. For example, after establishing a connection with a DualSense controller using HID_DualSense_Demo.scd, evaluate the line ~whichsurface.(3). It will activate the triangle button of the controller. Pressing the triangle button will post [ 3, Triangle, 1 ], and releasing the button will post [ 3, Triangle, 0 ] in the Post window. Press command+period or select Menu -> Language -> Stop to stop receiving messages from the controller.

Note that the arguments (|…args|) inside the HIDFunc.element function receive an array of numbers from the connected controller. The first number in the array, args[0], receives normalized data ranging from 0.0 to 1.0 as a float. The second number, args[1], receives the raw data as an integer, from 0 up to the element’s maximum value. The demo files use the integer, but users can use the float argument interchangeably.

The last group of commands, under //3. Audio Example, maps the controller to a SynthDef’s parameters. The code inside HIDFunc.element maps a button to change the volume of white noise and a continuous controller to change the panning. Modify the numbers inside the brackets [ ] at the end of HIDFunc.element( ) to map different buttons and continuous controllers. Note that the method .linlin is used to map the incoming range (0 to 255, or 0 to 65535) onto -1.0 to 1.0 in this example code.
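A sketch of that mapping idea (the SynthDef and the element numbers 0 and 14 are illustrative stand-ins, not the published demo code):

(
SynthDef(\noisePanSketch, { |amp = 0, pan = 0|
	Out.ar(0, Pan2.ar(WhiteNoise.ar(0.2), pan, amp.lag(0.05)));
}).add;
)

(
~noise = Synth(\noisePanSketch);
HIDFunc.element({ |...args| ~noise.set(\amp, args[1]) }, 0); // button: 0 or 1 gates the noise
HIDFunc.element({ |...args|
	~noise.set(\pan, args[1].linlin(0, 255, -1.0, 1.0)); // stick x-axis: 0-255 mapped to -1..1
}, 14);
)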

Summary and Application 

I share the mapping of four controllers so that others do not have to repeat the procedure and can move faster to the creative phase. I previously shared a game controller mapping in 2018, available as a published document in the journal Emille. That article is now outdated and only applicable to the DualShock. The current findings discussed here also have limitations: the mappings were tested only in SuperCollider 3.13 running on macOS Sequoia. I was not successful in connecting the controllers to SuperCollider on Windows. I also did not figure out the mappings for Max or other programs.

Once connected via HID in SuperCollider, a game controller becomes an expressive instrument for laptop performance despite the above limitations. Performers can incorporate years, if not decades, of gaming muscle memory to play music. As a demonstration, linked below are two of my compositions for game controller quartet. Interested music technologists can download the SuperCollider patches and scores from the links below to play the music using the game controllers analyzed in this article. Previous experience in SuperCollider is not needed to play the pieces.

PS Quartet No. 1

PS Quartet No. 2

Electronic Ensemble Repertoire – Classics

Here are three pieces I have presented regularly with the Electronic Music Ensemble of Wayne State (EMEWS). The repertoire's code, scores, or DAW project files are available online and are technologically simple to set up and execute. An ensemble director can make each piece presentable in one or two rehearsals, with no extra cost for preparation or concert.

John Cage, Four6

Four6 is an open-instrumentation piece suited for four electronic musicians. Each performer is asked to prepare twelve different sounds before the performance. They then play the sounds according to the timeline dictated in the score. There are no tech specifications (any instrument is acceptable), and performers do not need to know how to improvise or read traditional notation.

I learned to play this piece at a concert organized by the fidget in 2012. Since then, the resulting sound of the quartet has been delightful to both the audience and the performers. In EMEWS concerts, the four parts were sometimes doubled to accommodate a large ensemble. The performers changed their twelve sounds for each practice and performance to keep surprising the other performers.

I don’t have a link to the score, but it is easy to purchase. A nearby contemporary music performer friend probably has a copy.

Alvin Lucier, Vespers

Vespers turns the acoustic space into an interesting instrument. I have EMEWS play this piece in the first weeks of the semester so the performers learn the musical application of space, resonance, and movement. The instructions ask the performers to walk around a dark room with a device that makes clicking sounds. The performer's task is to find and share a location that makes the clicking sound interesting. In short, the performers become an organism with echolocation capacity. Any number of performers can play together.

The original instructions ask the performer to use a Sondol, but I don’t know what that is. So I made a SuperCollider patch that produces clicks with controllable rate and duration, sketched below. I added a feature to change the background color of the computer screen for an extra visual effect. I also thought a more directed performance might benefit performers with little experience in experimental music, so I arranged a version with additional guidelines. The resulting scores and media can be found at https://joowonpark.net/vespers/.
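A minimal sketch of such a clicker, with parameter names of my choosing (the published patch differs):

(
SynthDef(\clickSketch, { |rate = 4, decay = 0.02, amp = 0.5|
	var env = Decay2.ar(Impulse.ar(rate), 0.001, decay); // one short envelope per click
	Out.ar(0, (WhiteNoise.ar * env * amp) ! 2);
}).add;
)

x = Synth(\clickSketch);
x.set(\rate, 10, \decay, 0.05); // vary while walking to probe the room's response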

Terry Riley, In C

Electronic ensembles can jump on the bandwagon by performing In C, one of contemporary music's most popular ensemble pieces. For electronic ensemble performers, I made a Logic patch that uses loop functions. Performers at any notation-reading level can play In C by clicking a loop at the desired pace.

Pre-programmed melodies and rhythms, stored as loops, let the performers contribute different musical aspects. I ask my ensemble members to experiment with timbre. The performers can double the track with a different patch, change the filter settings, add effects, alter instrument settings, etc. They are to explore the uniqueness of electronic instruments – what can an electronic instrument do that others cannot?

Visit https://joowonpark.net/logicinc/ for detailed instructions. I am positive that a similar loop setup is possible in Ableton Live and on other platforms.

Four Hit Combo (2024)

In Four Hit Combo, each laptop ensemble member uses four audio files to create twenty-six flavors. Musical patterns arise from repetitions (loops), and different combinations mark the form of the music. The laptop ensemble members prepare their own samples before the performance, and they control loop start points and durations according to the score and the conductor’s cues. Because no specific audio files are attached to the piece, each performance can offer a unique sonic experience.
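The published patch is GUI-based, but the underlying looping idea can be sketched as follows (the buffer, SynthDef, and parameter names are assumptions, not the published code):

(
b = Buffer.read(s, Platform.resourceDir +/+ "sounds/a11wlk01.wav"); // any mono file works

SynthDef(\loopSketch, { |buf, start = 0, dur = 0.5, amp = 0.3|
	var trig = Impulse.ar(dur.reciprocal); // retrigger every dur seconds
	var sr = BufSampleRate.kr(buf); // start and dur are in seconds, converted to frames
	var phase = Phasor.ar(trig, BufRateScale.kr(buf), start * sr, (start + dur) * sr, start * sr);
	Out.ar(0, (BufRd.ar(1, buf, phase) * amp) ! 2);
}).add;
)

x = Synth(\loopSketch, [\buf, b, \start, 0.2, \dur, 0.3]);
x.set(\start, 0.5, \dur, 0.15); // move the loop window at a conductor's cue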

Instruments Needed

  1. Laptop: each performer needs a computer with SuperCollider installed.
  2. Amp: connect each laptop to a sound reinforcement system. If the performance space is small, the laptop's built-in speaker may be used.

Pre-Performance Preparation

  1. Designate a conductor and at least three performers. If there are more than three performers, parts can be doubled.
  2. Each performer prepares three audio files (wav, aif, or mp3). The first file should contain a voice, the second a pitched instrument sound, and the third a percussion sound. No file should be too short (less than a second) or too long (more than a minute). The [voice], [instrument], and [percussion] files should be different for every performer.
  3. While the voice, instrument, and percussion files are different for each performer, all performers should share one common sound file. This file will be used in the [finale].
  4. The conductor prepares one audio file about 10-30 seconds long. It could be any sound with noticeable changes. For example, a musical passage would work well, while unchanging white noise would not.
  5. Download FourHitCombo_Score.pdf, FourHitCombo_Performer.scd, and FourHitCombo_Conductor.scd from www.joowonpark.net/fourhitcombo
  6. Open the .scd files in SuperCollider. Follow the instructions in the .scd file to load the GUI screen.

Score Interpretation

  1. Proceed to the next measure only at the conductor’s cue. The conductor should give a cue to move on to the next measure every 10-20 seconds.
  2. In the [voice], [instrument], [percussion], and [finale] rectangles, drag and drop the corresponding audio file.
  3. In the [random] square, press the random button in the GUI.
  4. In the square with a dot, quickly move the cursor in the 2D slider to the notated location.
  5. In the square with a dot and arrow, slowly move the cursor from the beginning point to the end point of the arrow. It is OK to finish moving the cursor before the conductor’s cue.
  6. In a measure with no symbol, leave the sound as is. Do not silence the sound.
  7. In measure 27, all performers freely improvise. Use any sounds except the commonly shared sound reserved for [finale].