Guest lecture: Learning as Performance

Swedish artist-developer Alexander Berman presents his latest research.


AI_am. Artificial intelligence meets dance performance (Photo: Alexander Berman)

Title
Learning as Performance: Autoencoding and Generating Dance Movements in Real Time

Abstract
We describe the technology behind a performance where human dancers interact with an "artificial" performer projected on a screen. The system learns movement patterns from the human dancers in real time. It can also generate novel movement sequences that go beyond what it has been taught, thereby serving as a source of inspiration for the human dancers, challenging their habits and normal boundaries and enabling a mutual exchange of movement ideas. It is central to the performance concept that the system's learning process is perceptible to the audience. To this end, an autoencoder neural network is trained in real time with motion data captured live on stage. As training proceeds, a "pose map" emerges that the system explores in a kind of improvisational state. We show how this method is applied in the performance, and share observations and lessons learned in the process.
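
To make the idea in the abstract concrete, the sketch below shows the general technique rather than the actual AI_am implementation: an autoencoder trained incrementally on incoming pose vectors, whose low-dimensional latent space acts as a "pose map" that can then be sampled (here with a simple random walk) to generate novel poses. The pose dimensionality, network sizes, use of PyTorch and the exploration strategy are all illustrative assumptions.

# Minimal sketch (assumptions throughout; not the performance system's code):
# an autoencoder over pose vectors, trained incrementally as new motion-capture
# frames arrive, whose 2-D latent space serves as a "pose map" that can be
# sampled to generate novel poses.

import torch
import torch.nn as nn

POSE_DIM = 45      # e.g. 15 joints x 3 rotation values (assumed)
LATENT_DIM = 2     # a 2-D "pose map" keeps the latent space easy to visualise

class PoseAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(POSE_DIM, 64), nn.ReLU(),
            nn.Linear(64, LATENT_DIM),
        )
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 64), nn.ReLU(),
            nn.Linear(64, POSE_DIM),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

model = PoseAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(pose_batch):
    """One incremental training step on the latest captured poses."""
    reconstruction, _ = model(pose_batch)
    loss = loss_fn(reconstruction, pose_batch)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def improvise(steps=100, step_size=0.1):
    """Random walk through the latent 'pose map', decoding each point into a
    pose; a crude stand-in for the system's improvisational exploration."""
    z = torch.zeros(1, LATENT_DIM)
    poses = []
    with torch.no_grad():
        for _ in range(steps):
            z = z + step_size * torch.randn_like(z)
            poses.append(model.decoder(z).squeeze(0))
    return poses

# Simulated real-time loop: each batch stands in for live motion-capture data.
for frame in range(200):
    live_batch = torch.randn(16, POSE_DIM)   # placeholder for captured poses
    train_step(live_batch)

generated = improvise()
print(f"Generated {len(generated)} novel poses from the latent pose map")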

Bio
Alexander Berman is an artist and software developer based in Gothenburg and working at the intersection of art, technology and research.
