
Generative Music for Live Performance: Experiences with real-time notation

Published online by Cambridge University Press:  13 November 2014

Arne Eigenfeldt*
Affiliation:
School for the Contemporary Arts, Simon Fraser University, Vancouver, Canada
Abstract

Notation is the traditional method by which composers specify detailed relationships between musical events. However, the conventions under which that tradition evolved – the controlled interaction of two or more human performers – arose in situations quite different from those found in electroacoustic music. Many composers of electroacoustic music have nonetheless adopted the tradition for mixed works involving live performers, and new customs have emerged to address the coordination of performers with electroacoustic elements. The author presents generative music as one method of avoiding the fixedness of tape music: coupled with real-time notation for live performers, generative music continues research into expressive performance within electroacoustic music by incorporating instrumentalists rather than relying on synthetic output. Real-time score generation is described as the final goal of a generative system, and two recent works are presented as examples of the difficulties of real-time notation.
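The generative approach the abstract refers to can be illustrated with a toy first-order Markov process, a common technique in generative music. This is a minimal sketch for illustration only: the transition table and function names are invented here and are not taken from the author's systems.

```python
import random

# Hypothetical first-order Markov model over pitch names: each pitch
# maps to its possible successors. These transitions are invented for
# illustration, not derived from any real corpus.
TRANSITIONS = {
    "C": ["D", "E", "G"],
    "D": ["E", "C"],
    "E": ["F", "G", "C"],
    "F": ["E", "D"],
    "G": ["C", "E", "A"],
    "A": ["G", "F"],
}

def generate_phrase(start="C", length=8, seed=None):
    """Generate a pitch sequence by a random walk over the transition table."""
    rng = random.Random(seed)  # seeded for reproducibility
    phrase = [start]
    for _ in range(length - 1):
        phrase.append(rng.choice(TRANSITIONS[phrase[-1]]))
    return phrase

print(generate_phrase(seed=1))
```

In a real-time notation context, a generator like this would run during performance, with its output rendered as conventional notation and delivered to the instrumentalists' screens, rather than sent directly to a synthesiser.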

Article · © Cambridge University Press 2014



Eigenfeldt supplementary movie

Movie 1

Download Eigenfeldt supplementary movie(Video)
Video 113 MB

Eigenfeldt supplementary movie

Movie 2

Download Eigenfeldt supplementary movie(Video)
Video 46 MB

Eigenfeldt supplementary movie

Movie 3

Download Eigenfeldt supplementary movie(Video)
Video 152 MB
