for two percussionists and electronics
electro-acoustics / 4-channel audio
Youngjoo Ryu Portrait Concert,
Lee Kang Sook Hall,
Seoul, South Korea
JUL 6, 2023
⟪Percussion Duo Moitié 12th Project "DECODE"⟫
Seoul Arts Center IBK Chamber Hall,
Seoul, South Korea
(winner of Percussion Duo Moitié's Call for Scores 2023)
People live their lives along their own trajectories. Trajectories of different directions, with different ways of moving and different shapes of trails left behind, overlap and intersect to form connections.
This piece focuses on the centrality of 'hitting' and 'tapping' in percussion performance. It captures the changes in timbre that occur when the instruments being struck change. 'Trajectory' also experiments with how the sound changes as the two performers move.
The performers move along the instruments arranged in an arc, forming a trajectory. Each movement, starting from a distance, converges toward a point. When the separated instruments meet at the point of contact, the two trajectories blend, covering each other's territory with the same timbre. The combination of the percussion trajectories with the electro-acoustic sound, their meeting at a point, and the interplay of points and trajectories create a constantly changing tone.
• Techniques Employed
SuperCollider
Max/MSP
- Effects
- Live sound processing
- ICST Ambisonics plugin
- Master controller for the performance
Ableton Live
Sound Recording
Score
Production Details
The piece Trajectory was created between September and December 2022 and again between September and December 2023.
• Goal
To explore how sound interaction and mixture vary based on the two performers’ locations or paths while performing on two identical percussion setups.
• Composition Process
I started by choosing the types of percussion instruments to use and considering their arrangement on stage. I planned where the two percussionists would move and the trajectory of their movements according to the piece's flow. I then wrote the acoustic score and worked on the electronics in parallel. The acoustic score was first handwritten and then notated in Sibelius. For performance-technique symbols not included in Sibelius, custom symbols were created and applied manually.
• Electronic Music Components
Both fixed electronics (tape) and live electronics were used. The tape was composed in Ableton Live from two types of sources:
- Sample recordings of the sound-synthesis output from SuperCollider (a recording sketch follows this list).
- Microphone recordings of the percussion instruments, made during sessions with the percussionists.
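For the first source type, the sketch below shows how a SuperCollider synth's output might be recorded to disk for import into Ableton Live; the SynthDef, file path, and duration are placeholders rather than the actual material of the piece.

```supercollider
(
// Minimal sketch: render a SuperCollider synth to an audio file so it can be
// imported into Ableton Live as tape material.
// The SynthDef, file path, and duration are placeholders for illustration.
s.waitForBoot {
    SynthDef(\metalTexture, { |out = 0, freq = 700|
        var exciter = PinkNoise.ar(0.01);
        var sig = Klank.ar(`[[freq, freq * 1.7, freq * 2.3], nil, [2, 2, 2]], exciter);
        Out.ar(out, sig ! 2);
    }).add;

    s.sync;

    // capture ten seconds of the running synth to disk
    s.record("~/trajectory_tape_source.wav".standardizePath, numChannels: 2);
    x = Synth(\metalTexture);
    10.wait;
    x.free;
    s.stopRecording;
};
)
```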
Live Electronics
For the live electronics, a 6000 Hz high-pass filter was applied to the microphone input so that the effects would affect only high-pitched percussion instruments such as crotales and triangles. Pitch-shift and flanger effects were then applied and output in real time. The microphone inputs were taken from the microphones placed in front of Performer A and Performer B, and each performer's processed sound was played from the speaker on the opposite side (e.g., sounds from Performer A were played on the audience's right).
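The concert version of this chain was patched in Max/MSP; the SuperCollider sketch below only approximates the same signal flow, and the pitch-shift ratio, flanger settings, and channel numbers are assumed values.

```supercollider
(
// Approximation of the live-electronics chain described above, written in
// SuperCollider rather than Max/MSP. Effect settings and channel numbers
// are assumed values, not the concert settings.
SynthDef(\liveFX, { |in = 0, out = 1|
    var sig, shifted, flanged;
    sig = SoundIn.ar(in);                    // microphone in front of one performer
    sig = HPF.ar(sig, 6000);                 // keep only high-pitched material (crotales, triangles)
    shifted = PitchShift.ar(sig, 0.2, 1.5);  // assumed upward pitch-shift ratio
    // simple flanger: short modulated delay mixed back with the shifted signal
    flanged = shifted + DelayC.ar(shifted, 0.02, SinOsc.kr(0.2).range(0.001, 0.01));
    Out.ar(out, flanged);
}).add;
)

// Performer A's microphone (input 0) routed to the speaker on the opposite side (output 1)
~fxA = Synth(\liveFX, [\in, 0, \out, 1]);
```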
• Max/MSP Integration
Max/MSP also served as the master controller for the whole piece and handled the 4-channel ambisonics. All tape and live-electronics cues were driven by the metro object, with click tracks sent to the two performers' in-ear monitors.
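The actual cue system ran on Max/MSP's metro object; as an illustration of the idea, a clocked click track routed to an assumed in-ear output channel could be sketched in SuperCollider as follows (tempo and channel number are assumptions).

```supercollider
(
// Illustrative click track (the actual cue system used Max/MSP's [metro]).
// Tempo and the in-ear monitor output channel are assumed values.
SynthDef(\click, { |out = 2, freq = 1500|
    var env = EnvGen.kr(Env.perc(0.001, 0.05), doneAction: 2);
    Out.ar(out, SinOsc.ar(freq) * env * 0.3);
}).add;

~clock = TempoClock(96/60);          // assumed tempo: 96 BPM
~clock.sched(0, {
    Synth(\click);                   // one click per beat to the in-ears
    1                                // reschedule after one beat
});
)
```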
For the tape, stem sources grouped by sound type were used instead of the single full track exported from Ableton Live. The stems allowed more detailed control over the sound, with 19 tracks played simultaneously in Max/MSP.
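A stripped-down stem player might look like the following SuperCollider sketch; the concert version played its 19 stems inside Max/MSP, and the file names and stem count here are placeholders.

```supercollider
(
// Placeholder stem playback: load a few stereo stems and start them together
// so they stay aligned (the concert version ran 19 stems in Max/MSP).
~stems = ["stem_perc.wav", "stem_synth.wav", "stem_noise.wav"].collect { |name|
    Buffer.read(s, "~/trajectory_stems".standardizePath +/+ name)
};

SynthDef(\stemPlayer, { |out = 0, buf|
    var sig = PlayBuf.ar(2, buf, BufRateScale.kr(buf), doneAction: 2);
    Out.ar(out, sig);
}).add;
)

// launch all stems in one bundle so they start at the same time
s.bind { ~stems.do { |b| Synth(\stemPlayer, [\buf, b]) } };
```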
The ICST Ambisonics plugin was used for the 4-channel output: an ambisonic point was assigned to each track, and a function was created to move each point. The live-electronics sources were positioned within the ambisonic field in the same way for playback.
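As a rough stand-in for the ICST Ambisonics positioning (which runs in Max/MSP), the SuperCollider sketch below orbits a mono source around a four-speaker ring with PanAz; the buffer path, orbit rate, and speaker ordering are assumptions.

```supercollider
(
// Rough stand-in for the ICST Ambisonics point movement (done in Max/MSP):
// a mono source is orbited around a 4-channel ring with PanAz.
// Buffer path, orbit rate, and speaker ordering are assumed values.
~src = Buffer.read(s, "~/trajectory_stems/crotales_mono.wav".standardizePath);

SynthDef(\quadOrbit, { |out = 0, buf, rate = 0.05|
    var sig = PlayBuf.ar(1, buf, BufRateScale.kr(buf), loop: 1);
    var azimuth = LFSaw.kr(rate);            // -1..1 sweeps one full circle
    Out.ar(out, PanAz.ar(4, sig, azimuth));
}).add;
)

// slowly move the source around the audience
Synth(\quadOrbit, [\buf, ~src]);
```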