By Jessica Messier; March 4, 2019
S. Kemper, S. Barton (2018). In Proceedings of the 18th International Conference on New Interfaces for Musical Expression (NIME). Blacksburg, VA.
K. Sundberg, S. Barton, A. Walter, T. Sane, L. Baker, A. O’Brien (2018). In Proceedings of the 18th International Conference on New Interfaces for Musical Expression (NIME). Blacksburg, VA.
April 3, 2018; WPI; Worcester, MA
December 9, 2017; Boston, MA
October 15, 2017; Robot Block Party, HUBWeek 2017; Boston, MA
August 23, 2017; Web Audio Mostly Concert; Audio Mostly Conference; 93 Feet East; London, England
P. Mitrano, A. Lockman, J. Honicker, S. Barton (2017). In Proceedings of the 5th International Workshop on Musical Metacreation (MUME) at the 8th International Conference on Computational Creativity (ICCC). Atlanta, GA, USA.
June 10, 2017; WPI; Worcester, MA
S. Barton, E. Prihar, P. Carvalho (2017). Cyther: a human-playable, self-tuning robotic zither. In Proceedings of the 17th International Conference on New Interfaces for Musical Expression (NIME). Copenhagen, Denmark.
In the work, a human performer, Cyther (a human-playable robotic zither), and modular percussion robots interact with one another. The interaction between these performers is enabled both by the physical design of Cyther and by software written by the composer. The perceptual aspects of the system distinguish auditory events, create groupings, and find patterns. In response to perceived information, the system can mimic, transform, and generate material. It stores information about past events, and thus has a memory, which shapes the expressive choices it makes. It is used in improvisatory contexts to illuminate gestures that are only possible through electromechanical actuation, which in turn inspire a human performer to explore new expressive territory. The improvisations balance structure and freedom in order both to present the possibilities of this ensemble and to allow for spontaneity. In particular, the work explores rhythms and timbres that these machines make possible.
We often think of an instrument and the agent that plays it as unified: we speak of a flautist or a violinist as a single entity in which human and instrument work symbiotically. In other ways, performer and instrument are meaningfully distinct, and the boundary between the two is inflexible. What if this boundary were made porous, allowing the human to play the role of pseudo-static sound shaper while the instrument becomes dynamic and expressive? By integrating robotic actuation into a human-playable instrument, agency becomes amorphous and distributed as performer and machine interact through a shared medium. Human performer and machine can fluidly exchange the roles of impulse and filter. The robot inspires the performer with expressions made possible by mechanical actuation, while the performer transforms these gestures by physically manipulating the instrument. Reciprocally, the performer can affect how the robotic system both interprets and generates ideas. The results illuminate the expressive spaces that are human, that are mechanical, that are shared between the two, and that emerge as these worlds synthesize. The actions of both become parts of a symbiotic whole rather than self-contained instances that happen to be co-located; the system thus exemplifies cooperative interaction. The project builds on a lineage of technology that seeks augmentation through human-machine symbiosis. The possibilities offered by such human-playable robotic musical instruments have been little explored: the vast majority of musical robots function autonomously.
November 12, 2016; Third Live Studio; Somerville, MA
Nate Tucker – percussion
Cyther (human-playable robotic zither), robotic percussion
October 7, 2016; Franklin Pierce University; Rindge, NH
S. Barton (2016). In Proceedings of the 1st Conference on Computer Simulation of Musical Creativity.
Abstract: This paper explores musical, psychological, and philosophical ideas about how humans and machines interact in creative processes. It argues that creativity is a function of both generator and receiver, and that these roles can be amorphous in the creation and experience of electronic music. It offers an approach to structuring temporal spaces for rhythmic composition, which leads to the idea of machine rhythms, proposed as a promising area for creative expression.
August 11, 2016; Original Gravity All-Star Concert at Rising Tide; Portland, ME
June 14, 2016; by Julia Quinn-Szcesuil
June 9, 2016
June 18, 2016; 1st Conference on Computer Simulation of Musical Creativity; University of Huddersfield; UK
Join Original Gravity Artistic Director Keith Kirchoff and Public Relations Guru Greg Carlson (both exceptional brewers) as they design a unique, homebrewed beer that will aesthetically pair with Scott Barton’s Rise of a City, a piece for musical robot and guitar featured on our Summer 2016 Mystic Brewery concert.
February 15, 2016; Expressive Engines Concert; Rutgers University; New Brunswick, NJ
for flute, clarinet, piano, violin, cello, tom-tom, PAM (robotic string instrument), and robotic percussion
Fall 2014 – Spring 2015
commissioned by the Juventas New Music Ensemble
Machine expressivity is often thought of as involving precision, speed, rhythmic complexity, non-idiomatic (for human performers) pitch patterns, and replication. Human expressivity is often thought of as involving groove, phrasing, affect, contour, variation, articulation, entrainment, and communication. While these attributes help shape our conceptions of what is human versus what is mechanical, they are not confined to one category or the other: humans can be precise and robots can groove. Expressive identity is more analog than digital. This does not preclude expressive spaces unique to humans and machines; rather, it suggests that the areas between them are ambiguous and that the attributes that define them do not do so in a one-to-one fashion (instead, attribute-space relationships are a function of combination and context). The music explores these areas of ambiguity and clarity. Genre is treated in a similar way, such that stylistic exemplars are presented both authentically and in transformation. The intersections in expressive identity and style illuminate what is exclusive and what is shared.
February 12, 2016; SEAMUS National Conference; Carter Recital Hall, Georgia Southern University; Statesboro, GA
for guitar and robotic ensemble (PAM, MADI and CADI)
produced and recorded by Scott Barton, mixed by Marc Urselli and Scott Barton at East Side Sound Studios, NYC
Rise of a City introduces a human performer to the robotic creations of EMMI (Expressive Machines Musical Instruments, expressivemachines.com) for the first time. The piece features complementary string parts (one played by a human guitarist, one by the robotic string instrument PAM) that are supported by a robotic percussion ensemble. The piece explores mechanical gestures, human expression, virtuosity, and synchronicity by placing specific musical ideas in a variety of instrumental and temporal spaces. Material is soloed and shared between the parts, giving us a sense of the unique expressive characteristics of human versus robotic instrumentalists. From the perspective of narrative, the musical interactions between human and machine can be understood through the metaphor of how ideas develop. Sometimes multiple groups of people simultaneously cultivate ideas toward similar goals even though they don't live in the same place. Sometimes the paths of this race are parallel. Sometimes they diverge. When they diverge, the separation can result either in an alternate route to the original goal or in new paths that clear the way to previously unimagined possibilities. This has become a familiar phenomenon via technological innovation, scientific discovery, stylistic innovation, and the construction of physical communities (dwellings → cities). The construction of physical communities has particular metaphoric weight in the case of this piece. From small beginnings, a city exudes reiterative processes in multiple directions. New neighborhoods spring up that incorporate and/or react to adjacent areas. The restatements are accumulative, so that the entirety of the city becomes perpetually more massive and complex. At the same time, the most recent individual additions (buildings, in the case of a city) mirror the qualities of the whole in terms of grandeur and intricacy.
This path is not purely linear, of course, and the ability to start simply, on a small scale, or differently is always preserved.
December 12-13, 2015; Boston Museum of Science; Boston, MA.
November 21, 2015; Project Fusion; Killian Hall, MIT; Cambridge, MA. Performed by the Juventas New Music Ensemble.
November 13, 2015; Spaulding Hall, WPI; Worcester, MA. Performed by the Juventas New Music Ensemble.
September 26, 2015; Boston, MA; improvisation with Cyther (human-playable robotic zither) and Nate Tucker – percussion
July 24, 2015; Spotify Headquarters; NY, NY
T. Rogers, S. Kemper, S. Barton (2015). In Proceedings of the 15th International Conference on New Interfaces for Musical Expression (NIME).
June 13, 2015; WPI; Worcester, MA
June 3, 2015; T. Rogers, S. Kemper, S. Barton. Paper presented by the second author at The 15th International Conference on New Interfaces for Musical Expression; Baton Rouge, LA.
S. Barton, S. Kemper (2015). Published in UTS ePRESS; March, 2015.
The paper was presented at the International Conference on Social Robotics 2014.
March 2015; by Ronni Reich
February 27, 2015; Hamilton, NY. Human-robot improvisation and live coding with musical robots performed.
February 26, 2015; Syracuse, NY. Human-robot improvisation and live coding with musical robots performed.