Improv: Live coding for robot motion design

Alexandra Q. Nilles, Chase Gladish, Mattox Beckman, Amy LaViers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Often, people such as educators, artists, and researchers wish to generate robot motion quickly. However, current toolchains for programming robots can be difficult to learn, especially for people without technical training. This paper presents the Improv system, a programming language for high-level description of robot motion with immediate visualization of the resulting motion on a physical or simulated robot. Improv includes a "live coding" wrapper for ROS ("Robot Operating System"), an open-source robot software framework that is widely used in academia and industry and is integrated with many commercially available robots. Commands in Improv are compiled to ROS messages. The language is inspired by choreographic techniques and allows the user to compose and transform movements in space and time. In this paper, we present our work on Improv so far, as well as the design decisions made throughout its creation.
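The abstract describes commands compiled to ROS messages and a choreography-inspired ability to compose and transform movements in space and time. Since Improv is built on Haskell and roshask, a small Haskell sketch can illustrate what such a DSL core might look like. This is a hypothetical illustration under our own assumptions, not the actual Improv API: the `Cmd`/`Move` types and all function names (`seqM`, `retrograde`, `repeatM`) are invented for this example, with `Cmd` standing in for a ROS velocity message such as `geometry_msgs/Twist`.

```haskell
-- Hypothetical sketch of a choreography-inspired motion DSL.
-- Not the actual Improv API; names and types are illustrative only.

-- A primitive command, standing in for a ROS velocity message
-- (e.g. geometry_msgs/Twist for a differential-drive robot).
data Cmd = Cmd { linear :: Double, angular :: Double }
  deriving (Eq, Show)

-- A movement is a timed sequence of commands, one per control tick.
type Move = [Cmd]

-- Primitive movements.
forward, backward, left, right :: Move
forward  = [Cmd 1 0]
backward = [Cmd (-1) 0]
left     = [Cmd 0 1]
right    = [Cmd 0 (-1)]

-- Compose two movements sequentially in time.
seqM :: Move -> Move -> Move
seqM = (++)

-- A choreographic transformation: play a movement backwards
-- (reverse the order of commands and negate each velocity).
retrograde :: Move -> Move
retrograde = reverse . map (\(Cmd l a) -> Cmd (-l) (-a))

-- Repeat a movement n times.
repeatM :: Int -> Move -> Move
repeatM n m = concat (replicate n m)

-- In a live-coding setting, each Cmd would be published as a ROS
-- message; here we just print the compiled command stream.
main :: IO ()
main = print (repeatM 2 (forward `seqM` left))
```

In a real system the command stream would be published on a ROS topic at a fixed rate, so that edits to the program are reflected immediately in the robot's motion, as the abstract describes.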

Original language: English (US)
Title of host publication: Proceedings of the 5th International Conference on Movement and Computing, MOCO 2018
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450365048
DOIs
State: Published - Jun 28 2018
Event: 5th International Conference on Movement and Computing, MOCO 2018 - Genoa, Italy
Duration: Jun 28 2018 - Jun 30 2018

Publication series

Name: ACM International Conference Proceeding Series

Other

Other: 5th International Conference on Movement and Computing, MOCO 2018
Country/Territory: Italy
City: Genoa
Period: 6/28/18 - 6/30/18

Keywords

  • Choreography
  • Haskell
  • Human-robot interaction
  • Live coding
  • ROS
  • Robotics
  • Roshask

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications

