August 23, 2017; Web Audio Mostly Concert; Audio Mostly Conference; 93 Feet East; London, England
In the work, a human performer, Cyther (a human-playable robotic zither), and modular percussion robots interact with one another. The interaction between these performers is enabled both by the physical design of Cyther and by software written by the composer. The perceptual aspects of the system distinguish auditory events, create groupings, and find patterns. In response to perceived information, the system can mimic, transform, and generate material. It stores information about past events, and thus has a memory, which shapes the expressive choices it makes. It is used in improvisatory contexts to illuminate gestures that are only possible through electromechanical actuation, which in turn inspire a human performer to explore new expressive territory. The improvisations provide both structure and freedom, in order to present the possibilities of this ensemble while allowing for spontaneity. In particular, the work explores rhythms and timbres that these machines make possible.
We often think of an instrument and the agent that plays it as unified. That is, we talk about a flautist or a violinist as a single entity in which human and instrument work symbiotically together. Yet in other respects, performer and instrument are meaningfully distinct, and the boundary between the two is inflexible. What if this boundary were made porous, allowing the human to play the role of pseudo-static sound shaper while the instrument becomes dynamic and expressive? By integrating robotic actuation into a human-playable instrument, agency becomes amorphous and distributed as performer and machine interact through a shared medium. A human performer and the machine are able to move fluidly between the roles of impulse and filter. The robot inspires the performer with expressions made possible by mechanical actuation, while the performer transforms these gestures by physically manipulating the instrument. Reciprocally, the performer can affect how the robotic system both interprets and generates ideas. The results illuminate the expressive spaces that are human, that are mechanical, that are shared between the two, and that emerge as these worlds synthesize. The actions of both become parts of a symbiotic whole, rather than self-contained instances that are merely co-located; the system thus exemplifies cooperative interaction. The project builds on the lineage of technology that seeks augmentation through human-machine symbiosis. The possibilities offered by such human-playable robotic musical instruments have been little explored: the vast majority of musical robots function autonomously.
for the musical robots PAM and MADI; 3:00
This is a video about my residency as a sponsored artist at High Concept Laboratories in Chicago in the summer of 2013. By Carly Mostar.