Gestural Sound Toolkit V2

Toolkit in Max/MSP for gesture-to-sound scenario prototyping. See the general documentation and the accompanying article.

Install

Clone the repo:

git clone --depth=1 https://github.com/ircam-ismm/Gestural-Sound-Toolkit.git 

or download the zip

Gestural Sound Toolkit V2 is a Max package and must be copied into the Max Packages folder.
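For example, on macOS the Packages folder for Max 8 is typically ~/Documents/Max 8/Packages (the exact path depends on your Max version and platform); assuming you cloned the repository into the current directory, a possible copy command is:

cp -r Gestural-Sound-Toolkit "$HOME/Documents/Max 8/Packages/"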

The toolkit makes use of the MuBu library developed by the ISMM team at IRCAM.

MuBu is a Max/MSP library freely available as a Max package (via the Package Manager) or from the IRCAM Forum:

https://forum.ircam.fr/projects/detail/mubu/ (registration with the IRCAM Forum is required [spam-free])

The MuBu package must either be copied into the Max Packages folder or referenced in the File Preferences:

Options > File Preferences > +
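If you downloaded MuBu from the Forum rather than installing it via the Package Manager, you can copy it alongside the toolkit; a possible command, assuming the download was extracted to ~/Downloads/MuBu (adjust the paths to your setup):

cp -r "$HOME/Downloads/MuBu" "$HOME/Documents/Max 8/Packages/"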

Note for Windows users: you need to install the Visual Studio 2015 Redistributable Package for MuBu to work.

Credits

V2 is a fork of the Gestural Sound Toolkit (https://github.com/bcaramiaux/Gestural-Sound-Toolkit), a Max library for the design of Embodied Sonic Interactions.

V2 Contributors:

  • STMS Lab IRCAM CNRS Sorbonne Université (Frédéric Bevilacqua, Riccardo Borghesi, Diemo Schwarz, Victor Paredes)
  • ISIR Sorbonne Université (Baptiste Caramiaux)
  • LIMSI CNRS (Jules Françoise)
  • University of York (Alessandro Altavilla)

V1 Contributors:

  • AVI Group Goldsmiths College (Baptiste Caramiaux, Alessandro Altavilla)
  • IRCAM-Centre Pompidou (MuBu: Riccardo Borghesi, Diemo Schwarz, Norbert Schnell, Frédéric Bevilacqua, Jules Françoise)

EAVI website: eavi.goldsmithsdigital.com, (c) 2015 EAVI Group, Goldsmiths College, University of London

This toolkit was designed and implemented to investigate research questions in sonic interaction design, and led to a publication at the CHI 2015 conference. If you use the toolkit in your research or teaching, please credit our work:

Caramiaux, Baptiste, Altavilla, Alessandro, Pobiner, Scott and Tanaka, Atau. "Form Follows Sound: Designing Interactions from Sonic Memories". In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. pp. 3943-3952. 2015 (http://dx.doi.org/10.1145/2702123.2702515)

Acknowledgments

V2: developed as part of the ELEMENT project (ANR-18-CE33-0002), element-project.ircam.fr

V1: GST was developed as part of the MetaGestureMusic project, which received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. FP7-283771.

Original authors: Baptiste Caramiaux, Alessandro Altavilla
