June 22, 2022. New York City Electronic Music Festival, The Loreto Theater; New York
Mechanophore performance at CMMR
November 15-19, 2021; The 15th International Symposium on Computer Music Multidisciplinary Research (CMMR); Online / Tokyo, Japan
Context-specific Knowledge Is the “Key” to Salsa Music
L. Getz, S. Barton & L. Perry (2021). Auditory Perception & Cognition, DOI: 10.1080/25742442.2021.1964341
Mechanophore Performance at NIME
June 16, 2021. NIME (New Interfaces for Musical Expression) Conference, 2021; Online and NYU Shanghai, China
Mechanophore
2021
for virtual and robotic strings and percussion
by Scott Barton
robotic instruments by WPI’s Music Perception and Robotics Lab and EMMI
Mechanophore was inspired by the force-sensitive molecular units of the same name. When mechanophores are subjected to physical forces, they activate chemical reactions that can communicate their state (e.g. a color change) or even heal themselves. The musical work represents this process of increasing tension to the point of ring opening, out of which emerges a texture that ascends and heals. The second section represents a particular mechanophore, spiropyran, more literally by tracing the molecule’s skeletal structure in its pitch contours. Just as force transforms spiropyran into a different molecule (merocyanine), the musical theme morphs into new configurations as it progresses. After another ring opening, the final section of the work represents interactions between individual polymers within a material, which can be characterized by entanglement, bridging, paths of motion, qualities, sizes, velocities, densities and loops.
More philosophically, the piece shows the wonder and complexity of the microscopic world through sonic elements that border on the threshold of perceptibility. As polymer science brings the distinction between the ideas of organic and synthetic into focus, the music illustrates the continuum between these poles through various kinds of virtual and acoustic instruments (including the robotic string instruments PAM and Cyther) that are combined and manipulated in a panoply of ways. Spiropyran elastomers were used as membranes for PVC drums played by robotic actuators made from 3D-printed PLA, thus connecting the metaphors of the work to its physical realization. Mechanophore was commissioned by the Multiverse Concert Series in collaboration with the polymer scientists of the MONET group.
Tempo Mecho
2019
for the robotic instruments PAM, modular percussion and percussive aerophone (built by WPI’s MPR Lab and EMMI)
by Scott Barton
A groove changes identity depending on the tempo it inhabits. Typically, there are small ranges within which a rhythm feels at home. Once there, a rhythm reveals the energy, detail and character of its true self. Some rhythms are travelers, able to assimilate into contrasting locales. Some rhythms are chameleons, changing their colors depending on their temporal context. Tempo Mecho explores some of the ways that tempo change can affect our sense of musical material. Here, tempos can change gradually or shift abruptly according to a variety of mathematical ratios. These movements occur over a range of time scales to illuminate the rhythmic limits of short-term memory and what is required to entrain to a cyclic pattern that reveals a groove. A theme persists throughout the piece to make these rhythmic aspects, which also include unusual meters and polyrhythms, clear. The temporal complexity of these ideas finds a natural voice through mechatronic instruments.
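To make the idea of ratio-based tempo movement concrete, here is a minimal Python sketch (illustrative only, not the piece’s actual control software) that builds a tempo trajectory from the two kinds of moves described above: a gradual ramp toward a target tempo and an abrupt shift by a simple ratio such as 3:2. The function names and specific numbers are hypothetical.

```python
from fractions import Fraction

def ramp(start_bpm, end_bpm, steps):
    """Gradually interpolate tempo from start_bpm to end_bpm over `steps` beats."""
    return [start_bpm + (end_bpm - start_bpm) * i / (steps - 1) for i in range(steps)]

def shift(bpm, ratio):
    """Abruptly change tempo by a rational ratio, e.g. Fraction(3, 2)."""
    return bpm * ratio.numerator / ratio.denominator

# A hypothetical trajectory: hold a groove, accelerate gradually, then jump by simple ratios.
trajectory = [120.0] * 8                                   # hold the opening tempo
trajectory += ramp(120.0, 150.0, 8)                        # gradual acceleration
trajectory.append(shift(trajectory[-1], Fraction(3, 2)))   # abrupt shift up to 225 bpm
trajectory.append(shift(trajectory[-1], Fraction(2, 3)))   # and back down to 150 bpm
beat_durations = [60.0 / bpm for bpm in trajectory]        # seconds per beat
print([round(t, 1) for t in trajectory])
```

Each tempo value maps to a beat duration of 60/bpm seconds, which is the kind of quantity a scheduler for mechatronic instruments could actually use.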
Mechanophore performance at SEAMUS
April 24, 2021. SEAMUS (Society for Electro-Acoustic Music in the United States) 2021 conference; virtual
Mechanophore performance at Multiverse Concert Series
January 29, 2021. Art of Polymers Program; Multiverse Concert Series (virtual)
Human-Robot Improvisation at SEAMUS
May 30, 2020. SEAMUS (Society for Electro-Acoustic Music in the United States) 2020 Conference; virtual
Human-Robot Improvisation with Nate Tucker
March 9, 2020. Dance Complex; Cambridge, MA
Circularity in Rhythmic Representation and Composition
S. Barton (2020). Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham City University, pp. 505–508.
Parthenope: A Robotic Musical Siren
M. Sidler, M. Bisson, J. Grotz, and S. Barton (2020). Proceedings of the International Conference on New Interfaces for Musical Expression, Birmingham City University, pp. 297–300.
Musical Robotics Presentation at Fiske Elementary School
February 26, 2020; Fiske Elementary School; Wellesley, MA
Musical Robotics Presentation at Boston Museum of Science
December 14, 2019; Cambridge, MA
Live Coding Musical Robots at Algorave
December 11, 2019; WPI; Worcester, MA
Human-Robot Improvisation at Moment’s Notice
December 4, 2019; Boston Conservatory; Boston, MA
WPI Professor and Students Create Robot-Human Sound Fusion Concert
October 14, 2019; by Sharon Gaudin; The Herd
Mechanics Hall concert begs question, ‘Is there music without a human?’
October 28, 2019; by Tyran Grillo; telegram.com
Man, machine and music: WPI robots and live musicians to share stage at Mechanics Hall
October 24, 2019; by Richard Duckett; telegram.com
Robotic Instruments Will Take The Stage This Weekend In Worcester
October 25, 2019; by Kim Tunnicliffe; WBZ Radio
Machinic Tides at the Robotics Engineering Research Symposium
October 22, 2019; WPI; Worcester, MA
Sound Fusion: A concert of music made by humans and robots at Mechanics Hall
October 27, 2019; Mechanics Hall; Worcester, MA
Human-Robot Improvisation with DanceNite
August 23-24, 2019; VLA Dance – Bare Velvet; Cambridge, MA
Machinic Tides Installation
July 16-29, 2019; Worcester PopUp; Worcester, MA
Experiment in Augmentation 2 Performance
June 21, 2019; Make Music Day Worcester; Worcester PopUp; Worcester, MA
Tempo Mecho Performance at ICMC / NYCEMF 2019
June 18, 2019; The Loreto Theater – The Sheen Center; New York, NY
Music Robots at Touch Tomorrow
June 8, 2019; Touch Tomorrow; WPI; Worcester, MA
Machine Rhythm Study No. 2 Performance at SMC
May 30, 2019; Sound and Music Computing Conference; Málaga, Spain
DanceNite Improvisation
May 11, 2019; Dorchester Art Project; Boston, MA
Tempo Mecho Performance at SEAMUS
March 21, 2019; Berklee College of Music; Boston, MA
Mixing AI with Music & Literature
March 4, 2019; by Jessica Messier; The Herd
Experiment in Augmentation Performance at NYC EIS
February 23, 2019; NYC Electroacoustic Improvisation Summit; Voorhees Theater; Brooklyn, NY
Performance of Riley’s In C by Hinge and Musical Robots
December 7, 2018; performed by The Hinge Ensemble and the musical robots of WPI’s MPR Lab; Sonic Currents; Spaulding Hall, WPI; Worcester, MA
Musical Robotics Presentation at The Boston Museum of Science
December 8, 2018; The Boston Museum of Science; Boston, MA
Rise of a City Performance at Sonic Currents
December 7, 2018; Hinge Ensemble; Spaulding Hall, WPI; Worcester, MA
Human-Robot Improvisation at HUB week
October 12, 2018; Boston, MA
Stylistic Alchemies Review: Massimo Ricci. Touching Extremes.
Stylistic Alchemies Review: Avant Scena
Through the Rain
2 channel recording
Scott Barton – guitar
December, 2017
Through the Rain started with a chord progression written on guitar many years ago. The progression, played on electric guitar, is reflected in a number of virtual instruments and effects, blurring the line between acoustic and electronic, played and sequenced. The first section of the work illustrates movement from dislocation to synchronization (an idea also interpreted in Eroding Mountains). In the second section of the piece, the parts have coalesced and begin to move as a unified whole. This journey takes a variety of paths, retakes steps, and encounters ephemeral electronic weather systems. The storms and disorientation relent as the journey arrives, leaving a new idea, connected to the previous one, that announces idyllic perpetuation.
Carried by Currents
for flute, oboe, clarinet, bassoon and electronics
August 2017
In the process of creating carried by currents, I imagined something ambient to balance the rather frenetic and fragmented character of other recent works. I felt motions that were smooth and rolling; that rippled through a surface in a variety of directions. I sought a hybrid character that radiated warmth in a tactilely organic and fuzzily electric way. I conceived a texture that was complex and heterogeneous but also whole. I envisioned something that metaphorically extended to both water and electricity, to human and technology, to illustrate the complexity and peace of the flows around us.
Experiment in Augmentation 2
for human improviser, PAM (robotic string instrument), Cyther (human-playable robotic zither) and robotic percussion
2018
Experiment in Augmentation 2 features a human-robot improvisation consisting of a human improviser and the musical robots PAM, percussive aerophone, and percussion built by WPI’s Music, Perception and Robotics Lab and EMMI. The robots respond to human-produced cues with algorithmically-generated statements. Their performative idiosyncrasies transform idealized pitch, rhythm and velocity information. The human performer nudges the machines in particular directions and pulls them back if they have become too adventurous. He indicates which gestures should persist, which should be recalled and which ones should be developed further by the machines. The human is thus both composer and conductor as the compositional process unfolds in performance. Enabling human control of higher-level musical elements (i.e. meter, rhythmic subdivisions, pitch set) and machine control of lower-level ones (e.g. pitch, temporal position) allows the performer’s attention to shift and roam, and thus highlights a way in which human expressive abilities can be augmented via physical computing technologies.
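As a rough sketch of that division of labor (offered purely for illustration; this is not the piece’s actual software), the following Python fragment lets human-specified higher-level constraints, such as a pitch set, meter and subdivision, bound a generator, while the machine fills in the lower-level details of which slots sound and which pitches they take. All names and values here are hypothetical.

```python
import random

def generate_phrase(pitch_set, beats_per_bar, subdivisions, density=0.6):
    """Fill one bar: the human supplies the constraints, the machine chooses
    which slots to play and which pitches to use."""
    slots = beats_per_bar * subdivisions
    phrase = []
    for slot in range(slots):
        if random.random() < density:                  # machine decides temporal position
            phrase.append((slot / subdivisions,        # onset expressed in beats
                           random.choice(pitch_set)))  # machine decides pitch
    return phrase

# Human-set higher-level parameters (e.g. chosen via cues during the improvisation):
pitch_set = [60, 62, 65, 67, 70]   # illustrative MIDI note numbers
print(generate_phrase(pitch_set, beats_per_bar=7, subdivisions=4))
```

In such a scheme, “nudging” the machines amounts to the performer updating parameters like the density or the pitch set between phrases while the generative loop keeps running.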
Mechatronic Expression: Reconsidering Expressivity in Music for Robotic Instruments
S. Kemper, S. Barton (2018). In Proceedings of the 18th International Conference on New Interfaces for Musical Expression (NIME). Blacksburg, VA.
A Robotic Percussive Aerophone
K. Sundberg, S. Barton, A. Walter, T. Sane, L. Baker, A. O’Brien (2018). In Proceedings of the 18th International Conference on New Interfaces for Musical Expression (NIME). Blacksburg, VA.
Experiment in Augmentation 2 Performance
June 8, 2018; The Haven; Charlottesville, VA
Experiment in Augmentation 2 Performance at NIME 2018
June 2, 2018; Moss Arts Center, VA Tech; Blacksburg, VA
Musical Robotics presentation at Music and the Brain Seminar
April 3, 2018; WPI; Worcester, MA
There’s a Persistent Hum in This Canadian City, and No One Knows Why
Rise of a City Performance at Original Gravity: Take Flight
January 20, 2018; Boston Button Factory; Boston, MA
Musical Robotics Presentation at the Boston Museum of Science
December 9, 2017; Boston, MA
Musical Robotics Presentation at HUBWeek 2017
October 15, 2017; Robot Block Party, HUBWeek 2017; Boston, MA
Music Technology Panelist at the Boston New Music Festival
September 24, 2017; Killian Hall, MIT; Cambridge, MA
Experiment in Augmentation 1 performance at Audio Mostly
August 23, 2017; Web Audio Mostly Concert; Audio Mostly Conference; 93 Feet East; London, England
Effusion Performance at the Applause Music Festival
July 15, 2017; Fort Worth, TX, USA
Experiment in Augmentation 1 performance at MUME
June 19, 2017; Musical Metacreation Concert at the Eighth International Conference on Computational Creativity; Atlanta, GA, USA
Using Recurrent Neural Networks to Judge Fitness in Musical Genetic Algorithms
P. Mitrano, A. Lockman, J. Honicker, S. Barton (2017). In Proceedings of the 5th International Workshop on Musical Metacreation (MUME) at the 8th International Conference on Computational Creativity (ICCC). Atlanta, GA, USA.
Musical Robotics Presentation at Touch Tomorrow
June 10, 2017; WPI; Worcester, MA
Cyther: A human-playable, self-tuning robotic zither presentation at NIME
May 17, 2017; Aalborg University; Copenhagen, Denmark
Cyther: A human-playable, self-tuning robotic zither
S. Barton, E. Prihar, P. Carvalho (2017). Cyther: a human-playable, self-tuning robotic zither. In Proceedings of the 17th International Conference on New Interfaces for Musical Expression. Copenhagen, Denmark.
Experiment in Augmentation 1 performance at NIME
May 17, 2017; Copenhagen, Denmark
Experiment in Augmentation 1
Spring 2017
In the work, a human performer, Cyther (a human-playable robotic zither) and modular percussion robots interact with each other. The interaction between these performers is enabled by both the physical design of Cyther and software written by the composer. The perceptual aspects of the system distinguish auditory events, create groupings and find patterns. In response to perceived information, the system can mimic, transform and generate material. It stores information about past events, and thus has memory, which shapes the expressive choices that it makes. It is used in improvisatory contexts to illuminate unique gestures that are only possible through electromechanical actuation, which in turn inspire a human performer to explore new expressive territory. The improvisations provide structure and freedom in order to both present the possibilities of this ensemble and allow for spontaneity. In particular, the work explores rhythms and timbres that are enabled by these machines.
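One way to picture the mimic/transform/generate behavior and the role of memory is the Python sketch below, given purely as an illustration (the class name and the particular transformations are hypothetical, not the work’s actual code): recent inter-onset intervals are stored and either echoed back or altered before being returned.

```python
import random
from collections import deque

class RhythmMemory:
    """Store recently heard inter-onset intervals (in seconds) and answer
    with a mimicked or transformed version of the stored pattern."""
    def __init__(self, size=16):
        self.intervals = deque(maxlen=size)    # memory of past events

    def hear(self, interval):
        self.intervals.append(interval)

    def respond(self):
        pattern = list(self.intervals)
        if not pattern:
            return []
        choice = random.choice(["mimic", "augment", "rotate"])
        if choice == "mimic":                  # echo the pattern back as heard
            return pattern
        if choice == "augment":                # stretch it to twice its duration
            return [i * 2 for i in pattern]
        return pattern[1:] + pattern[:1]       # rotate: begin from the second event

memory = RhythmMemory()
for ioi in [0.25, 0.25, 0.5, 0.25]:            # intervals detected from the human performer
    memory.hear(ioi)
print(memory.respond())
```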
Cyther
We often think of an instrument and the agent that plays it as unified. That is, we talk about a flautist or a violinist as a single thing that requires both human and instrument working symbiotically together. In other ways, performer and instrument are meaningfully distinct, and the boundary between the two is inflexible. What if this boundary is made porous, allowing the human to play the role of pseudo-static sound shaper while the instrument becomes dynamic and expressive? By integrating robotic actuation into a human-playable instrument, agency becomes amorphous and distributed as performer and machine interact through a shared medium. A human performer and the machine are able to fluidly move between the roles of impulse and filter. The robot inspires the performer with expressions made possible by mechanical actuation while the performer transforms these gestures by physically manipulating the instrument. Reciprocally, the performer can affect how the robotic system both interprets and generates ideas. The results illuminate the expressive spaces that are human, that are mechanical, that are shared between the two, and that emerge as these worlds synthesize. The actions of both become parts of a symbiotic whole, rather than self-contained instances that are co-located, thus the system exemplifies cooperative interaction. The project builds on the lineage of technology that seeks augmentation through human-machine symbiosis. The possibilities offered by such human-playable robotic musical instruments have been little explored (the vast majority of musical robots function autonomously).
Effusion performance at SEAMUS
April 20-22, 2017; St. Cloud State University; St. Cloud, MN
Systematic Variation in Rhythm Production as Tempo Changes
Barton, S., Getz, L., & Kubovy, M. (2017). Systematic Variation in Rhythm Production as Tempo Changes. Music Perception: An Interdisciplinary Journal, 34(3), 303-312.
We investigated the effect of tempo on the production of the syncopated 3-2 son clave rhythm. We recorded eleven experienced percussionists performing the clave pattern at tempi ranging from 70 bpm to 210 bpm. As tempo increased, percussionists shortened the longest intervals and lengthened the shortest interval towards an intermediate interval that is located in the first and second positions in the pattern. This intermediate interval was stable across tempi. Contrary to prior studies, we found that the complexity of interval ratios had little effect on production accuracy or stability and the “short” interval in the pattern was not particularly stable. These results suggest that as tempo is varied, (1) experienced musicians systematically distort rhythmic intervals, (2) rhythmic configuration, and not just the complexity of interval ratios, affects the production of rhythmic intervals, and (3) the distinction between long and short intervals is context-dependent.
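For readers unfamiliar with the pattern: the 3-2 son clave is conventionally notated with inter-onset intervals of 3, 3, 4, 2 and 4 sixteenth notes, so the two “intermediate” 3-sixteenth intervals sit in the first and second positions mentioned above. The short Python sketch below shows the nominal (notated, not produced) interval durations at the study’s tempo extremes, assuming the bpm values refer to the quarter-note beat.

```python
# Nominal 3-2 son clave, written as inter-onset intervals in sixteenth notes.
CLAVE_IOIS = [3, 3, 4, 2, 4]

def iois_in_ms(bpm, sixteenths):
    """Convert sixteenth-note intervals to milliseconds at a quarter-note tempo."""
    sixteenth_ms = 60_000 / bpm / 4
    return [round(n * sixteenth_ms) for n in sixteenths]

for bpm in (70, 210):   # the tempo extremes used in the experiment
    print(bpm, "bpm:", iois_in_ms(bpm, CLAVE_IOIS), "ms")
# 70 bpm:  [643, 643, 857, 429, 857]
# 210 bpm: [214, 214, 286, 143, 286]
```

The reported finding is that the intervals musicians actually produced deviated systematically from these nominal values as tempo increased.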
Effusion
Summer-Fall 2016
2 channel recording; 5:59
In “Pleasure Beats: Rhythm and the Aesthetics of Current Electronic Music”, Ben Neill describes how popular and art music are distinguished along rhythmic lines. He predicts a future music where such distinctions are less clear; where the rhythmic vernacular of pop music is spoken in artistic territory. Many of my recent compositional efforts, including this piece, exemplify movement in this direction, in terms of rhythm as well as other musical elements. Here, a foundation is set with materials from the pop world: verse-chorus form, 4/4 time signatures, fuzzy synth basses, rock grooves and EDM breaks. These materials are then manipulated in electroacoustic-art-music ways: timbres are transformed, grains are made and re-ordered, meters are changed, and earlier materials are restated in discontinuous sequences. The result is less a fusion and more a congeries where non-ironic choruses and dizzying jump-cuts cohabit. These combinations are not motivated by a desire to influence the language of art music for its own sake; rather, they are expressions of cultural heterogeneity that is not compartmentalized.
Musical Robots Presentation at the Boston Museum of Science
December 10-11, 2016; Boston, MA
Systematic Variation in Rhythm Production as Tempo Changes at APCAM 2016
November 17, 2016; Poster presentation at the 15th Annual Auditory Perception, Cognition and Action Meeting; Boston, MA
Opus Palladianum: Voice and Drums Performance at Festival Ecos Urbanos
November 16, 2016; Mexico City, Mexico
Human-Robot Improvisation Performance at Opensound Concert Series
November 12, 2016; Third Live Studio; Somerville, MA
Nate Tucker – percussion; Cyther (human-playable robotic zither), robotic percussion
Musical Robotics Presentation and Human-Robot Improvisation Performance
October 7, 2016; Franklin Pierce University; Rindge, NH
Creativity in the Generation of Machine Rhythms
S. Barton (2016). In Proceedings of the 1st Conference on Computer Simulation of Musical Creativity.
Abstract: This paper explores musical, psychological and philosophical ideas about how humans and machines interact in creative processes. It argues that creativity is a function of both generator and receiver, and that these roles can be amorphous in the creation and experience of electronic music. It offers an approach to structuring temporal spaces for rhythmic composition, which leads to the idea of machine rhythms, which are proposed as a promising area for creative expression.
Rise of a City Performance at Original Gravity Concert Series
August 11, 2016; Original Gravity All-Star Concert at Rising Tide; Portland, ME
Opus Palladianum: Voice and Drums Performance at the VU Symposium
July 6, 2016; Park City, UT
“Bots and Beer” – The Daily Herd
June 14, 2016; by Julia Quinn-Szcesuil
“New Music Instruments Hackathon: Meet the Speakers” – I CARE IF YOU LISTEN
June 9, 2016
Creativity in the Generation of Machine Rhythms presentation at CSMC
June 18, 2016; 1st Conference on Computer Simulation of Musical Creativity; University of Huddersfield; UK
Performance at Original Gravity Concert Series
Performed works: Rise of a City; Human-Robot Improvisation: Cyther; Water, Rhythm and Light (premiere); and Human-Robot Improvisation: Pattern
June 17, 2016; Mystic Brewery; Chelsea, MA
Musical Robots Presentation at Touch Tomorrow
June 11, 2016; WPI; Worcester, MA
Human-Robot Improvisation: Cyther Performance at New Musical Instruments Hackathon
June 11, 2016; Spotify HQ; NY, NY
Musical Robotics Presentation at New Musical Instrument Hackathon
June 11, 2016; Spotify Headquarters; NY, NY
Original Gravity Podcast: Rise of a City
Join Original Gravity Artistic Director Keith Kirchoff and Public Relations Guru Greg Carlson (both exceptional brewers) as they design a unique, homebrewed beer that will aesthetically pair with Scott Barton’s Rise of a City, a piece for musical robot and guitar featured on our Summer 2016 Mystic Brewery concert.
Composing Perception in Interactive Musical Machines Presentation
February 15, 2016; Expressive Engines: Musical Technologies from Automata to Robots Symposium; Rutgers University; New Brunswick, NJ
Human-Robot Improvisation: Cyther Performance at Expressive Engines Concert
February 15, 2016; Expressive Engines Concert; Rutgers University; New Brunswick, NJ
Intersections
for flute, clarinet, piano, violin, cello, tom-tom, PAM (robotic string instrument), and robotic percussion
Fall 2014 – Spring 2015
commissioned by the Juventas New Music Ensemble
Machine expressivity is often thought of as involving precision, speed, rhythmic complexity, non-idiomatic (for human performers) pitch patterns and replication. Human expressivity is often thought of as involving groove, phrasing, affect, contour, variation, articulation, entrainment and communication. While these attributes help shape our conceptions of what is human versus what is mechanical, they are not confined to one category or the other: humans can be precise and robots can groove. Expressive identity is more analog than digital. This does not preclude expressive spaces that are unique to humans and machines; rather, it suggests that the areas between them are ambiguous and that the attributes that define them do not do so in a one-to-one fashion (instead, attribute-space relationships are a function of combination and context). The music explores these areas of ambiguity and clarity. Genre is treated in a similar way such that stylistic exemplars are presented authentically and in transformation. The intersections in expressive identity and style illuminate what is exclusive and what is shared.
Human-Robot Improvisation: Cyther performance at SEAMUS
February 12, 2016; SEAMUS National Conference; Carter Recital Hall, Georgia Southern University; Statesboro, GA
Rise of a City
2009
for guitar and robotic ensemble (PAM, MADI and CADI)
produced and recorded by Scott Barton, mixed by Marc Urselli and Scott Barton at East Side Sound Studios, NYC
Rise of a City introduces a human performer to the robotic creations of EMMI (Expressive Machines Musical Instruments, expressivemachines.com) for the first time. The piece features complementary string parts (one played by a human guitarist, one played by the robotic string instrument PAM) that are supported by a robotic percussion ensemble. The piece explores mechanical gestures, human expression, virtuosity and synchronicity by placing specific musical ideas in a variety of instrumental and temporal spaces. Material is soloed and shared between the parts, giving us a sense of the unique expressive characteristics of human versus robotic instrumentalists. From the perspective of narrative, the musical interactions between human and machine can be understood through the metaphor of how ideas develop. Sometimes multiple groups of people simultaneously cultivate ideas towards similar goals even though they don’t live in the same place. Sometimes the paths of this race are parallel. Sometimes they diverge. When they diverge, the separation can result in either an alternate route to the original goal or new paths that clear the way to previously unimagined possibilities. This has become a familiar phenomenon to us via technological innovation, scientific discovery, stylistic innovation and the construction of physical communities (dwellings → cities). The construction of physical communities has particular metaphoric weight in the case of this piece. From small beginnings a city exudes reiterative processes in multiple directions. New neighborhoods spring up that incorporate and / or react to adjacent areas. The restatements are accumulative, so that the entirety of the city becomes perpetually more massive and complex. At the same time, the most recent individual additions, buildings in the case of a city, mirror the qualities of the whole in terms of grandeur and intricacy. This path is not purely linear of course, and the ability to start simply, small, or differently is always preserved.
Musical Robots Presentation at the Boston Museum of Science
December 12-13, 2015; Boston Museum of Science; Boston, MA.
Intersections Performance at MIT
November 21, 2015; Project Fusion; Killian Hall, MIT; Cambridge, MA. Performed by the Juventas New Music Ensemble.
Intersections performance at Composing the Future concert
November 13, 2015; Spaulding Hall, WPI; Worcester, MA. Performed by the Juventas New Music Ensemble.
Dysrhythmia of Timed Movements in Parkinson’s Disease and Freezing of Gait
Tolleson, C. M., Dobolyi, D. G., Roman, O. C., Kanoff, K., Barton, S., Wylie, S. A., … & Claassen, D. O. (2015). Dysrhythmia of timed movements in Parkinson’s disease and freezing of gait. Brain Research, 1624, 222-231.
Human-Robot Improvisation and Life’s Node performance at the Worcester Art Museum
October 1, 2015; Worcester, MA; Nate Tucker – percussion
Opus Palladianum: Voice and Drums performance at ICMC
September 27, 2015; ICMC (International Computer Music Conference); Denton, TX
Life’s Node Performance at the Boston Center for the Arts
September 26, 2015; Boston, MA; improvisation with Cyther (human-playable robotic zither) and Nate Tucker – percussion
Opus Palladianum: Voice and Drums performance at Si15
August 23, 2015; Si15 2nd International Symposium on Sound and Interactivity; Singapore.
Human-Robot Improvisation and Presentation at NYC New Musical Instruments Hackathon
July 24, 2015; Spotify Headquarters; NY, NY
MARIE: Monochord-Aerophone Robotic Instrument Ensemble
T. Rogers, S. Kemper, S. Barton (2015). In Proceedings of the 15th International Conference on New Interfaces for Musical Expression.
Musical Robotics Presentation at Touch Tomorrow
June 13, 2015; WPI; Worcester, MA
MARIE: Monochord-Aerophone Robotic Instrument Ensemble paper presented at NIME
June 3, 2015; T. Rogers, S. Kemper, S. Barton. Paper presented by the second author at The 15th International Conference on New Interfaces for Musical Expression; Baton Rouge, LA.
Intersections performance at Music in Motion concert
May 5 and 6, 2015; Club Oberon; Cambridge, MA. Performed by the Juventas Ensemble.
Eroding Mountains performance at Sonic Voyages: MayDay concert
May 1, 2015; University of Bristol; Bristol, UK
Eroding Mountains performance at SEAMUS
March 26-28, 2015; SEAMUS (Society for Electro-Acoustic Music in the United States) Conference; Virginia Tech; Blacksburg, VA
Extended Abstract for Drum Circle
S. Barton, S. Kemper (2015). Published in UTS ePRESS, March 2015.
The paper was presented at the International Conference on Social Robotics 2014.