HPrizm (a.k.a. High Priest of Anti-pop Consortium)
Iarla Ó Lionáird
This piece grew out of PLOrk member Josh Becker’s final project for MUS/COS 314 (Electronic and Computer Music I). Josh developed a software instrument that uses complex feedback between multiple oscillators to create unstable but fascinating textures. The instrument design included a visual component that draws shapes called “Lissajous figures” from the audio waveform. In exploring how we might use the instrument in performance, we hit upon the idea of performing some music from the Renaissance as a quartet. Three PLOrk members, Josh, Matthew, and Catherine, are all in Dmitri Tymoczko’s MUS 315 course this semester, which is focused on algorithmic composition and computational musicology. They decided to create an algorithmically altered version of a Renaissance piece, using the techniques they learned in Tymoczko’s course. Then Josh asked, “What if instead of projecting the images, we used lasers?” Indeed, WHAT IF WE USED LASERS?????
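For the curious: a Lissajous figure is what you get when one oscillator drives the horizontal position of a point on screen and a second oscillator drives the vertical position. The sketch below is a minimal pure-Python illustration with made-up frequencies, not the instrument’s actual code (whose feedback oscillators are far less tame):

```python
import math

def lissajous(freq_x, freq_y, phase=0.0, n=1000):
    """Sample a Lissajous figure: two sine oscillators drive a
    point's x and y screen coordinates over one full cycle."""
    pts = []
    for i in range(n):
        t = 2 * math.pi * i / (n - 1)
        pts.append((math.sin(freq_x * t + phase), math.sin(freq_y * t)))
    return pts

# A 3:2 frequency ratio with a 90-degree phase offset traces a classic
# closed curve; unstable, drifting oscillators make the figure writhe.
points = lissajous(3, 2, phase=math.pi / 2)
```

When the two frequencies form a simple ratio the figure closes on itself; as the feedback pushes the oscillators off that ratio, the shape rotates and tangles, which is what makes it compelling to watch (or to draw with lasers).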
music by PLOrk
visuals by Drew Wallace
This piece came about based on an idea floated by PLOrk alum and electrical engineering PhD student Mitch Nahmias a couple of years ago. The lab he works in, the Lightwave Communications Research Laboratory, uses what are called “biological models” of neurons: bits of computer code designed to behave the way a neuron would in a brain. They spike in response to certain inputs and then settle, each spike tracing a curve with a characteristic pattern and shape. Under some conditions they oscillate, the spikes repeating at some frequency, as they do in an animal brain. He mentioned this to me and suggested that we try using one of those models to make sound.
Now, a couple of years later, we’ve built a piece around this idea. Aatish Bhatia of the Council for Science and Technology joined the project, and he figured out how to discretize the model so that it could run in real time on a computer. From there, I built a software instrument using this new synthesis technique, and Assistant Director Mike Mulshine wrapped that code into a handy package and composed the musical structure for the piece. Drew Wallace, a senior in Computer Science and Visual Arts, created the live visualization, which represents each performer as a neuron-like object and shows how we are “connected” to each other (connections that change throughout the piece).
The instrument itself is interesting and unpredictable. Unlike most electronic musical instruments, in which the controls represent parameters that are meaningful to humans (like pitch, amplitude, and brightness), this instrument has parameters that are quite counterintuitive to us. We have a knob for the sodium activation channel, another for the potassium channel, another for the capacitance of the system, and so on. These parameters interact in strange ways, and we have been slowly learning to play this instrument by experimenting with those interactions.
A couple of curious asides: the biological model on which we built our instrument is actually a model of a giant squid axon, since apparently that was the only animal with neurons large enough for measurements to be taken easily on single neurons in the 1950s, when the model was developed. Also, the moment when we all synchronize mirrors what happens in a brain during a seizure.
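For the technically curious: the giant-squid-axon model described above is the classic Hodgkin-Huxley model, and “discretizing” it means stepping its differential equations forward in tiny time increments. The sketch below uses a simple forward-Euler step with the textbook parameter values; the instrument’s actual discretization and its mapping to audio are not reproduced here, so treat this as an illustration rather than the piece’s code:

```python
import math

def hh_step(V, m, h, n, I, dt, C=1.0):
    """One forward-Euler step of the Hodgkin-Huxley squid-axon model.
    V is membrane voltage (mV), dt is the time step (ms), I is the
    injected current (uA/cm^2), C the membrane capacitance. Values are
    the textbook 1952 parameters, shifted so rest sits near -65 mV."""
    # Channel-gating rate functions (units: 1/ms)
    am = 0.1 * (V + 40.0) / (1.0 - math.exp(-(V + 40.0) / 10.0))
    bm = 4.0 * math.exp(-(V + 65.0) / 18.0)
    ah = 0.07 * math.exp(-(V + 65.0) / 20.0)
    bh = 1.0 / (1.0 + math.exp(-(V + 35.0) / 10.0))
    an = 0.01 * (V + 55.0) / (1.0 - math.exp(-(V + 55.0) / 10.0))
    bn = 0.125 * math.exp(-(V + 65.0) / 80.0)

    # Ionic currents: sodium, potassium, and leak
    INa = 120.0 * m ** 3 * h * (V - 50.0)
    IK = 36.0 * n ** 4 * (V + 77.0)
    IL = 0.3 * (V + 54.387)

    # Euler updates: these are the "knobs" the program notes mention,
    # sodium and potassium channel dynamics plus the capacitance C.
    V += dt * (I - INa - IK - IL) / C
    m += dt * (am * (1.0 - m) - bm * m)
    h += dt * (ah * (1.0 - h) - bh * h)
    n += dt * (an * (1.0 - n) - bn * n)
    return V, m, h, n

# Drive the neuron with a constant current: it fires repetitively, and
# the voltage trace can be read out as an audio signal or a trigger.
V, m, h, n = -65.0, 0.053, 0.596, 0.318   # resting state
trace = []
for _ in range(5000):                      # 50 ms at dt = 0.01 ms
    V, m, h, n = hh_step(V, m, h, n, I=10.0, dt=0.01)
    trace.append(V)
```

Turning the capacitance or channel knobs changes the spike shape and the firing frequency in tangled, nonlinear ways, which is exactly why the instrument feels so counterintuitive to play.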
music by HPrizm and PLOrk
video by Eric Hayes and Jeff Snyder
This piece developed as a collaboration between our guest artist HPrizm and the group. HPrizm had the idea to do something using cut-up video of a dance performance, controlled by the music performers. Around that time, Computer Science student and filmmaker Eric Hayes contacted me about advising him on a junior project, and I roped him into doing the video development for this piece. Eric was interested in using the motion capture system in the Council for Science and Technology’s Studiolab, and he designed some wearable markers so that he could track the movements of several points on a dancer’s body. He recorded captures of several dancers from different stylistic backgrounds, from modern dance to hip-hop, and created a system that visualizes this data and allows the musicians to control and interact with the visuals. HPrizm had the idea to take the integration one step further, using the data from the motion capture to control the musicians’ audio parameters as well, so we can take the curve of a dancer’s hand motion and apply that shape to a filter sweep, for instance.
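The hand-motion-to-filter-sweep idea can be sketched simply. The function below is hypothetical (the piece’s actual mapping code isn’t shown here): it normalizes a captured trajectory to the range 0 to 1, then scales it exponentially to a cutoff frequency, since our hearing of pitch and brightness is roughly logarithmic:

```python
def motion_to_cutoff(samples, lo_hz=200.0, hi_hz=8000.0):
    """Map a captured motion trajectory (e.g. a hand's height sampled
    over time) onto filter-cutoff values in Hz. Hypothetical mapping:
    normalize to [0, 1], then scale exponentially across the range."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0  # avoid dividing by zero for a still hand
    return [lo_hz * (hi_hz / lo_hz) ** ((s - lo) / span) for s in samples]

# A rising hand gesture sweeps the filter up from 200 Hz to 8 kHz.
cutoffs = motion_to_cutoff([0.1, 0.4, 0.9, 1.3])
```

Any other audio parameter (delay time, grain density, playback speed) could be driven the same way, which is what lets the dancers’ recorded gestures shape the musicians’ sound.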
Three of the performers in this piece are playing on analog modular synthesizers, using a sequencer interface called the MantaMate, which was developed over the past few years by the performers playing it, Snyder, Becker, and Mulshine. Other Princeton students who have been involved in the development of the MantaMate include Elaine Chou, YC Sun, and Chloe Song, all supported by funding from the Keller Center.
Anna Kimmel '18
Raheem Barnett '18
Dorothy Chen '17
Lauren Auyeung '19
My God It’s Full of Stars
poem by Tracy K. Smith
This piece was inspired by a poem written by Princeton creative writing professor Tracy K. Smith. Iarla, an Irish folk singer who is joining us as a guest artist, loved the poem, and wanted to do a piece that responded to its celestial themes. Priest wrote his own verse in response to the poem, and we built a piece around these contributions. PLOrk members and composition PhD students Florent Ghys and Chris Douthitt handled the compositional and arrangement duties to bring form to the floating ideas that were coalescing among the group.
Here’s the poem, the third section of Smith’s My God It’s Full of Stars:
Perhaps the great error is believing we’re alone,
That the others have come and gone—a momentary blip—
When all along, space might be choc-full of traffic,
Bursting at the seams with energy we neither feel
Nor see, flush against us, living, dying, deciding,
Setting solid feet down on planets everywhere,
Bowing to the great stars that command, pitching stones
At whatever are their moons. They live wondering
If they are the only ones, knowing only the wish to know,
And the great black distance they—we—flicker in.
Maybe the dead know, their eyes widening at last,
Seeing the high beams of a million galaxies flick on
At twilight. Hearing the engines flare, the horns
Not letting up, the frenzy of being. I want to be
One notch below bedlam, like a radio without a dial.
Wide open, so everything floods in at once.
And sealed tight, so nothing escapes. Not even time,
Which should curl in on itself and loop around like smoke.
So that I might be sitting now beside my father
As he raises a lit match to the bowl of his pipe
For the first time in the winter of 1959.
Ndivumbamirewo + Nyong’o
by Marshall Munhumumwe and the Four Brothers
transcribed by Florent Ghys
arranged by PLOrk
Florent Ghys brought in this piece, which he transcribed from a recording by the Zimbabwean pop group the Four Brothers. Our arrangement takes a lot of liberties with it, but we think it maintains the spirit of the original tune. We are using two new digital instruments developed by PLOrk members to realize our version. One is a live coding environment designed by Avneesh Sarwate, which allows the performers to record incoming audio and then rearrange it into complex rhythms through a simple syntax built on the programming language Python. The other was developed by graduate composer Chris Douthitt as a project for Kofi Agawu’s African Music course. Douthitt’s software uses the concept of expressing rhythms as “inter-onset intervals” (the time between the attack points of successive notes), which is a common way to describe African bell patterns. The software allows the performers to define two rhythmic patterns in this notation and then “drift” between them, interpolating where the beats should occur as the rhythms morph into each other.
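Douthitt’s actual interpolation scheme isn’t spelled out above, but the core idea can be sketched: convert each inter-onset-interval pattern to absolute onset times, then blend the two onset lists with a morph parameter. This sketch assumes, for simplicity, that both patterns have the same number of onsets:

```python
def ioi_to_onsets(iois):
    """Convert inter-onset intervals to absolute onset times from 0."""
    onsets, t = [], 0.0
    for gap in iois:
        onsets.append(t)
        t += gap
    return onsets

def drift(iois_a, iois_b, amount):
    """Interpolate between two rhythms expressed as IOIs.
    amount=0 gives pattern A, amount=1 gives pattern B; values in
    between place each onset proportionally between its two targets.
    (Assumes both patterns contain the same number of onsets.)"""
    a = ioi_to_onsets(iois_a)
    b = ioi_to_onsets(iois_b)
    return [(1 - amount) * x + amount * y for x, y in zip(a, b)]

# Halfway between the 12-pulse "standard" bell pattern (2-2-1-2-2-2-1)
# and a perfectly even seven-onset grid over the same 12 pulses.
halfway = drift([2, 2, 1, 2, 2, 2, 1], [12 / 7] * 7, 0.5)
```

Sweeping the morph parameter continuously during performance is what produces the “drift”: the beats slide from one groove toward the other rather than switching abruptly.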