Software Agents in Music and Sound Art Research/Creative Work: current state and a possible direction

Ian Whalley
Abstract

Composers, musicians and computer scientists have begun to use software-based agents to create music and sound art in both linear and non-linear idioms (the latter with non-predetermined form and/or content), and some robust approaches now draw on several disciplines. This paper surveys recent work: agent technology is first introduced, a theoretical framework for its use in creating music/sound art works is put forward, and an overview of common approaches is then given. After identifying areas neglected in recent research, the paper briefly explores a possible direction for further work. Finally, it proposes a vision for a new hybrid model that integrates non-linear, generative, conversational and affective perspectives on interactivity.

Corresponding author
E-mail: musik@waikato.ac.nz
Organised Sound
  • ISSN: 1355-7718
  • EISSN: 1469-8153