
Music Composition as an Act of Cognition: ENACTIV – interactive multi-modal composing system

Miroslav Spasov
Abstract

ENACTIV is a project that addresses, explores and offers solutions for converting a performer/composer's expressive sonic and kinetic patterns into continuous variables for driving sound synthesis and processing in real-time interactive composition. The investigation is inspired by achievements in cognitive science, in particular Humberto Maturana and Francisco Varela's Santiago Theory (1980, 1987), in which the authors explain how the process of cognition arises through ‘structural coupling’ – the mutual influence between living beings, and between living beings (humans in particular) and the environment – and how this process gives rise to the patterns of organisation that drive the individual's behaviour.
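ENACTIV itself is realised in Max/MSP (Spasov 2009–10a), but the underlying idea, reducing a live audio or gesture stream to a few continuous control variables and scaling them onto synthesis parameters, can be sketched independently. The Python sketch below is purely illustrative: the choice of features (RMS amplitude, spectral centroid), the parameter ranges and the function names are assumptions, not the project's documented design.

```python
import numpy as np

def extract_features(frame, sample_rate=44100):
    """Reduce one audio frame to two continuous control variables:
    RMS amplitude and spectral centroid. These stand in for the kinds
    of expressive features such a system might track; ENACTIV's actual
    feature set is not specified here (hypothetical)."""
    rms = np.sqrt(np.mean(frame ** 2))
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return rms, centroid

def map_to_synthesis(rms, centroid):
    """Scale the features onto two hypothetical synthesis parameters."""
    amplitude = min(rms * 4.0, 1.0)                            # output gain, 0..1
    cutoff_hz = 200.0 + 7800.0 * min(centroid / 8000.0, 1.0)   # filter cutoff
    return {'amplitude': amplitude, 'cutoff_hz': cutoff_hz}

# One frame of noise stands in for a live voice or gesture signal.
frame = 0.1 * np.random.randn(1024)
print(map_to_synthesis(*extract_features(frame)))
```

In a real-time setting this analysis-and-mapping loop would run once per audio or sensor frame, with the resulting parameter dictionary sent on to the synthesis engine.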

The project investigates how a composer/performer's cognitive archetypes, developed through his or her ‘structural coupling’ with the social and natural environment and expressed through voice and unwitting hand gestures, can be associated with or ‘mapped’ onto sound synthesis and processing parameters in such a way that the system plays an active role and acts reciprocally, introducing a degree of variation and unpredictability at its output. The aim of the project is to develop a creative tool that allows professional musicians, multi-media artists and non-expert participants to engage with multi-modal improvisation in an intuitive way.
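One plausible way to supply this ‘degree of variation and unpredictability’ is to perturb the performer-derived control values with a deterministic-chaotic signal; Spasov's companion Attractors Library (2009–10b) suggests that attractor dynamics play a role of this kind in his work. The sketch below uses a Lorenz attractor for the perturbation as an assumption, not as the project's documented method; the class and parameter names are hypothetical.

```python
import numpy as np

class LorenzModulator:
    """Chaotic modulation source: a Lorenz attractor stepped by simple
    Euler integration. Its bounded but unpredictable orbit perturbs an
    incoming control value (a hypothetical stand-in for ENACTIV's
    actual mapping layer)."""

    def __init__(self, sigma=10.0, rho=28.0, beta=8.0 / 3.0, dt=0.005):
        self.sigma, self.rho, self.beta, self.dt = sigma, rho, beta, dt
        self.x, self.y, self.z = 1.0, 1.0, 1.0

    def step(self):
        # Classic Lorenz equations, integrated one small step.
        dx = self.sigma * (self.y - self.x)
        dy = self.x * (self.rho - self.z) - self.y
        dz = self.x * self.y - self.beta * self.z
        self.x += dx * self.dt
        self.y += dy * self.dt
        self.z += dz * self.dt
        return self.x / 20.0  # roughly normalise to about -1..1

    def modulate(self, control, depth=0.2):
        """Blend a performer-derived control value (0..1) with the
        attractor state, clipping to keep the result in range."""
        return float(np.clip(control + depth * self.step(), 0.0, 1.0))

mod = LorenzModulator()
for control in (0.3, 0.5, 0.7):  # e.g. successive gesture readings
    print(mod.modulate(control))
```

Because the attractor is deterministic but sensitive to its state, repeated identical gestures yield related yet never identical outputs, which is one way a mapping layer can appear to ‘act reciprocally’ rather than merely echo the performer.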

Corresponding author
E-mail: m.spasov@mus.keele.ac.uk
References
Belet, B. 2003. Live Performance Interaction for Humans and Machines in the Early Twenty-First Century: One Composer's Aesthetics for Composition and Performance Practice. Organised Sound 8(3): 305–312.
Bevilacqua, F., Müller, R., Schnell, N. 2005. MnM: A Max/MSP Mapping Toolbox. Proceedings of the 2005 International Conference on New Interfaces for Musical Expression (NIME05). Vancouver, Canada.
Blaine, T., Fels, S. 2003. Contexts of Collaborative Musical Experiences. Proceedings of the 2003 Conference on New Interfaces for Musical Expression (NIME03). Montreal, Canada.
Blaine, T., Perkis, T. 2000. Jam-O-Drum: A Study in Interaction Design. Proceedings of the ACM DIS 2000 Conference, August. New York: ACM Press.
Bottoni, P., Faralli, S., Labella, A., Pierro, M. 2006. Mapping with Planning Agents in the Max/MSP Environment: The GO/Max Language. Proceedings of the 2006 International Conference on New Interfaces for Musical Expression (NIME06). Paris, France.
Bown, O., Eldridge, A., McCormack, J. 2009. Understanding Interaction in Contemporary Digital Music: From Instruments to Behavioural Objects. Organised Sound 14(2): 188–196.
Broad, C. D. 1925. The Mind and its Place in Nature. London: Routledge & Kegan Paul.
Cadoz, C., Wanderley, M. M. 2000. Gesture-Music. In M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.
Camurri, A., Ricchetti, M., Trocca, R. 2000. EyesWeb – Toward Gesture and Affect Recognition in Dance/Music Interactive Systems. IEEE Multimedia Systems, Florence, Italy, 1999.
Capra, F. 1996. The Web of Life. New York: Doubleday.
Cassell, J. 1998. A Framework For Gesture Generation and Interpretation. In R. Cipolla and A. Pentland (eds.) Computer Vision in Human-Machine Interaction. New York: Cambridge University Press.
Cassell, J., Scott, P. 1996. Distribution of Semantic Features Across Speech and Gesture by Humans and Computers. In Proceedings of the Workshop on Integration of Gesture in Language and Speech. Cambridge, MA: The MIT Press.
Cassell, J., Vilhjálmsson, H. 1999. Fully Embodied Conversational Avatars: Making Communicative Behaviors Autonomous. Autonomous Agents and Multi-Agent Systems 2(1): 45–64.
Chomsky, N. 1972. Language and Mind. Orlando, FL: Harcourt Brace.
Cont, A., Coduys, T., Henry, C. 2004. Real-time Gesture Mapping in Pd Environment using Neural Networks. Proceedings of the International Conference on New Interfaces for Musical Expression, 2004 (NIME04). Hamamatsu, Japan.
Dawkins, R. 1989. The Selfish Gene. Oxford: Oxford University Press.
Di Scipio, A. 2003. Sound is the Interface: From Interactive to Ecosystemic Signal Processing. Organised Sound 8(3): 269–277.
Dorin, A. 2001. Generative Processes and the Electronic Arts. Organised Sound 6(1): 47–53.
Drummond, J. 2009. Understanding Interactive Systems. Organised Sound 14(2): 124–133.
Garnett, G., Goudeseune, C. 1999. Performance Factors in Control of High-Dimensional Spaces. Proceedings of the 1999 International Computer Music Conference. San Francisco, CA: ICMA, 268–271.
Gray, R. M. 1996. Archetypal Explorations. London: Routledge.
Hawkins, M. 1997. Social Darwinism in European and American Thought, 1860–1945. Cambridge: Cambridge University Press.
Hunt, A., Kirk, R. 2000. Mapping Strategies for Musical Performance. In M. M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.
Hunt, A., Wanderley, M. 2002. Mapping Performer Parameters to Synthesis Engines. Organised Sound 7(2): 97–108.
Jensenius, A. R., Godøy, R. I., Wanderley, M. M. 2005. Developing Tools for Studying Musical Gestures within the Max/MSP/Jitter Environment. Proceedings of the 2005 International Computer Music Conference. Barcelona: ICMA.
Jung, C. G. 1958. The Archetypes and the Collective Unconscious. Princeton, NJ: Princeton University Press.
Kaltenbrunner, M., Jordà, S., Geiger, G., Alonso, M. 2006. The Reactable*: A Collaborative Musical Instrument. Proceedings of the Workshop on ‘Tangible Interaction in Collaborative Environments’ (TICE), at the 15th International IEEE Workshops on Enabling Technologies (WETICE), Manchester, UK.
Kendon, A. 1972. Some Relationships between Body Motion and Speech: An Analysis of an Example. In A. Siegman and B. Pope (eds.) Studies in Dyadic Communication. Elmsford, NY: Pergamon Press.
Koch, C. 2004. The Quest For Consciousness: A Neurobiological Approach. Englewood, CO: Roberts and Company.
Lewis, G. 1993. Voyager. Tokyo: Disk Union Avan-014.
Maturana, H., Varela, F. 1980. Autopoiesis and Cognition: The Realisation of the Living. In R. Cohen and M. Wartofsky (eds.) Boston Studies in the Philosophy of Science, vol. 42. Dordrecht: D. Reidel.
Maturana, H., Varela, F. 1987. The Tree of Knowledge. Boston, MA: Shambhala Publications.
McNeill, D. 1992. Hand and Mind: What Gestures Reveal About Thought. Chicago, IL: University of Chicago Press.
Miranda, E. R. 2003. A-Life and Musical Composition: A Brief Survey. IX Brazilian Symposium on Computer Music. Campinas, Brazil.
Miranda, E., Gimenes, M. 2008. An A-Life Approach to Machine Learning of Musical Worldviews for Improvisation Systems. Proceedings of the 5th Sound and Music Computing Conference. Berlin.
Miranda, E. R., Wanderley, M. 2006. New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. Middleton, WI: A-R Editions.
Morales-Manzanares, R., Morales, E. F., Dannenberg, R. 2001. SICIB: An Interactive Music Composition System Using Body Movements. Computer Music Journal 25(2): 25–36.
Neumann, E. 1954. The Origins and History of Consciousness, trans. R. F. C. Hull. London: Routledge & Kegan Paul.
Paine, G. 2002. Interactivity, Where to From Here? Organised Sound 7(3): 295–304.
Patten, J., Recht, B., Ishii, H. 2002. Audiopad: A Tag-Based Interface for Musical Performance. Proceedings of the 2002 International Conference on New Interfaces for Musical Expression (NIME02). Dublin, Ireland.
Ramachandran, V. S., Blakeslee, S. 1998. Phantoms in the Brain: Human Nature and the Architecture of the Mind. London: Fourth Estate.
Rovan, J. B., Wanderley, M. M., Dubnov, S., Depalle, P. 1997. Instrumental Gestural Mapping Strategies as Expressivity Determinants in Computer Music Performance. In A. Camurri (ed.) Kansei, The Technology of Emotion. Proceedings of the AIMI International Workshop. Genoa: Associazione di Informatica Musicale Italiana, October.
Singer, E. 2005. Cyclops. Max object for analysing and tracking live video. www.cycling74.com/products/cyclops.
Spasov, M. 2009–10a. ENACTIV, Interactive Multimodal Composing System. www.cycling74.com/share.html.
Spasov, M. 2009–10b. Attractors Library. www.cycling74.com/share.html.
Van Nort, D., Wanderley, M. M., Depalle, P. 2004. On the Choice of Mappings Based On Geometric Properties. Proceedings of the International Conference on New Interfaces for Musical Expression (NIME04). Hamamatsu, Japan.
Wanderley, M. 2000. Gestural Control of Music. In M. Wanderley and M. Battier (eds.) Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.
Wanderley, M., Battier, M. (eds.) 2000. Trends in Gestural Control of Music. Paris: IRCAM–Centre Pompidou.
Waters, S. 2007. Performance Ecosystems: Ecological Approaches to Musical Interaction. Proceedings of EMS: The ‘Languages’ of Electroacoustic Music (EMS07). Leicester, UK.
Wessel, D. 2006. An Enactive Approach to Computer Music Performance. In Y. Orlarey (ed.) Le feedback dans la creation musicale. Lyon: Studio Gramme.
Whalley, I. 2000. Applications of System Dynamics Modelling to Computer Music. Organised Sound 5(3): 149–157.
Whalley, I. 2004. Adding Machine Cognition to a Web-Based Interactive Composition. Proceedings of the International Computer Music Conference, 1–6 November 2004. Miami, FL: ICMA, 197–200.
Whalley, I. 2005. Software Agents and Creating Music/Sound Art: Frames, Directions, and Where to From Here? Proceedings of the International Computer Music Conference, 5–9 September 2005. Barcelona: ICMA, 691–695.
Whalley, I. 2009. Software Agents in Music and Sound Art Research/Creative Work: Current State and a Possible Direction. Organised Sound 14(2): 156–167.
Winkler, T. 2001. Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, MA: The MIT Press.