The Making of "Wildlife"

 

David A. Jaffe and W. Andrew Schloss

                                   

D. A. Jaffe, Box 4268, Stanford, CA 94309 USA

email: daj@ccrma.stanford.edu

 

W. A. Schloss, School of Music, University of Victoria, Victoria BC CANADA V8W 2Y2 

email: aschloss@nero.uvic.ca

 

 

This paper describes some of the issues that came up in the composition of Wildlife, an improvisationally-oriented interactive computer piece.  We address the question of what makes a good "machine partner" and how the duo ensemble situation can be expanded by the use of configurable instruments and computers.

 

Introduction

 

Wildlife is a computer-extended duo in five movements for Mathews/Boie Radio Drum and Zeta violin.  It was co-composed by Jaffe and Schloss and premiered by the composers in Victoria, Canada in 1991.  It has been performed numerous times in North America and Europe, and is slated for release on an upcoming CDCM compact disc of music featuring the Radio Drum.

 

The name "Wildlife" is used in two senses.  First, it refers to the improvisational nature of the work.  All materials are generated in direct response to the performers' actions and there are no pre-recorded or stored sequences.  Furthermore, the malleable nature of the instruments allows the traditional boundaries that separate one instrument from another to be broken down.  As a simple example, the violinist's glissando may change the pitch of chords played by the percussionist.  Allowing the computer a degree of autonomy takes the performers a further step away from the customary ensemble relationship.  Thus, they find themselves "living on the wild side."

 

The autonomy granted the computers gives rise to the second sense in which the name "Wildlife" applies.  Robotics expert Hans Moravec describes, in his recent book Mind Children, a world in which autonomous artificial life forms breed, propagate, compete and interact.  These life forms can be beneficial, parasitic or benign.  In Wildlife, the computers spawn independent processes that suggest such beings.

 

Technical Description

 

The Mathews/Boie Radio Drum is a sensor capable of reporting accurately the position of two mallets in three dimensions.  It generates no sound; the effect of a performed gesture is entirely determined by software.  The Zeta violin is a solid-body electric violin with both a MIDI output and an amplified electronic sound.  Each string has its own pickup and pitch detector, allowing for independent pitch bend for each string. 

 

The system configuration consists of both the Zeta violin and the Drum passing information to a Macintosh IIci computer, which does preliminary gestural processing and passes MIDI information to a SampleCell sample player and a NeXT computer. The NeXT does further gestural processing and algorithmic music generation, performs synthesis on the NeXT's built-in DSP chip, and sends MIDI to a Yamaha TG77 synthesizer.  The software on the Macintosh is based on the Max system, while the software on the NeXT is based on Ensemble and the NeXT Music Kit. 
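The signal chain described above can be sketched as a simple fan-out pipeline.  This is an illustrative model only, not the actual Max or Music Kit code used in the piece; all function names here are hypothetical.

```python
# Hypothetical sketch of the Wildlife signal chain: both instruments feed
# the Macintosh, which preprocesses gestures and fans MIDI out to the
# SampleCell and the NeXT; the NeXT processes further and also drives
# the TG77. Events are modeled as plain dicts.

def mac_preprocess(event):
    """Preliminary gestural processing on the Macintosh (Max)."""
    return dict(event, stage="mac")

def next_process(event):
    """Further gestural processing and algorithmic generation on the NeXT."""
    return dict(event, stage="next")

def run_chain(event, samplecell, dsp, tg77):
    """Fan one performed event out the way the hardware is wired."""
    e = mac_preprocess(event)
    samplecell.append(e)   # Mac -> SampleCell sample player
    e2 = next_process(e)
    dsp.append(e2)         # NeXT -> built-in DSP synthesis
    tg77.append(e2)        # NeXT -> Yamaha TG77 via MIDI

samplecell, dsp, tg77 = [], [], []
run_chain({"source": "violin", "pitch": 60}, samplecell, dsp, tg77)
```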

 

In addition to the Radio Drum and the Zeta violin, both performers have a number of foot pedals and switches.  Of particular interest is the percussionist's 18-key chromatic bank of organ-style velocity-sensitive foot pedals.  These pedals supply numerous kinds of control information rather than playing pitches; however, we have found that the standard "black and white key" layout makes it quick and easy to locate a particular footswitch.

 

Plan of Attack

 

We began work on the piece by exploring a wide range of interactive scenarios.  We quickly found that many were unsatisfactory for various reasons.  In particular, as improvising performers, we found it essential to feel that our actions had a discernible and significant effect on the music being produced.  In many situations, one or the other of us felt that his influence was inconsequential; the situation was analogous to playing a solo in a jazz ensemble with insensitive accompanists.  The manner in which each of us needed to exert influence could differ greatly in any given situation.  Yet there was an intangible but undeniable difference between those situations in which our improvisational imagination was fired and those in which it was inhibited by opaque complexity.

 

Visceral Learning

 

It was difficult to predict whether an interactive scenario imagined on paper would prove effective in performance.  The only way to decide what worked well was to spend long hours performing with each setup, searching for material that seemed to complement it, and exploring its potential.  Though we both fully understood the logic executed by the computer programs, it was only through exploration as performers that we discovered their hidden aspects.  Thus, the music for each of the five movements was developed in parallel with its interaction scheme.

 

As a simple example, in the fourth movement, the violinist supplies the pitches that make up the percussionist's improvisation.  The percussionist can choose to play recently-played pitches or can go back in time to pitches played earlier.  In this context, the violinist plays only occasionally and in such a manner as to change the flow of the ongoing music.  This movement was particularly difficult for him, because the effect of the material he played became evident only some time later, when the percussionist played those pitches.  An implementation detail supplied the answer.  The "remembered" pitches played by the violinist are stored in a buffer that is not circular: every hundred notes (a number at first set arbitrarily), the buffer empties and begins to refill.  This quirk provided just the "foot in the door" the violinist needed.  By playing tremolo, he could fill the whole buffer with a single pitch and constrain the percussionist to that pitch.  The implementation also guaranteed that every now and then the percussionist would be forced to play only very recently performed pitches.  Thus, what started out as an arbitrary, irrelevant constraint turned out to be an asset in disguise: "It's not a bug, it's a feature!"
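The non-circular buffer behavior can be sketched as follows.  This is a minimal reconstruction from the description above, not the piece's actual code; the class and method names are illustrative.

```python
# Sketch of the fourth movement's non-circular pitch buffer: violin
# pitches accumulate until the buffer is full, then it is emptied and
# refilled from scratch rather than wrapping around. The capacity of
# 100 matches the (initially arbitrary) value mentioned in the text.

class PitchBuffer:
    def __init__(self, capacity=100):
        self.capacity = capacity
        self.pitches = []

    def add(self, pitch):
        """Record a violin pitch; clear and restart when full."""
        if len(self.pitches) >= self.capacity:
            self.pitches = []          # not circular: start over
        self.pitches.append(pitch)

    def recent(self, n=1):
        """Pitches the percussionist can draw on, newest last."""
        return self.pitches[-n:]

buf = PitchBuffer(capacity=100)
# A tremolo fills the buffer with a single pitch, constraining the
# percussionist to that pitch -- the violinist's "foot in the door."
for _ in range(100):
    buf.add(64)
```

Right after the buffer clears, only the newest pitches exist, which is why the percussionist is periodically forced back to very recent material.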

 

Musical Cowboys

 

When we gave the computer a large degree of autonomy, the major problem became how to avoid the feeling that the music was "getting away from us."  One way to deal with this problem is to consider the performer's role as analogous to that of a conductor of a piece that does not use strict rhythmic coordination between parts.  In such a piece, the conductor gives signals that control the large-scale flow of the music without specifying the individual details.  To use a more colorful analogy, the independent computer processes are like cattle allowed to wander the open plains, and the performer's control is that of the cowboy who reins them in when it is time to head into the corral.

 

As an example, in the third movement, the computer generates melodic material based on the pitches played by the violinist, transposing those pitches to any octave and using fractal shapes to derive melodic, rhythmic and dynamic contours.  The percussionist can exert control over this process by changing the upper and lower bounds of its range, by changing whether or not repeated notes are tied, and by changing the tempo.  The process can also be made to follow the players' dynamics, and the violinist can change its density by playing repeated notes.  Thus, the computer processes can be allowed to wander freely and then be reined in suddenly in response to the performers' actions.  Nevertheless, the process is interesting and complex enough that the computer often seems to have a mind of its own.  We are never sure how it will behave and are often surprised (and sometimes perplexed) by its seeming whimsy.
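The "cattle and cowboy" relationship can be sketched as a process that wanders on its own while a performer narrows its range at any moment.  The actual fractal contour algorithm is not detailed in the text; a simple bounded random walk stands in for it here, and all names and ranges are assumptions.

```python
import random

# Sketch of a semi-autonomous melodic process like the third movement's:
# it wanders freely within bounds, and the percussionist can "rein it in"
# by narrowing those bounds mid-stream.

class WanderingProcess:
    def __init__(self, low=36, high=84):
        self.low, self.high = low, high
        self.pitch = (low + high) // 2

    def set_range(self, low, high):
        """Performer control: narrow (or widen) the process's range."""
        self.low, self.high = low, high
        self.pitch = min(max(self.pitch, low), high)

    def next_note(self, rng=random):
        """One step of the walk, clamped to the current range."""
        self.pitch += rng.choice([-2, -1, 1, 2])
        self.pitch = min(max(self.pitch, self.low), self.high)
        return self.pitch

proc = WanderingProcess()
melody = [proc.next_note() for _ in range(8)]        # roams freely
proc.set_range(60, 64)                               # suddenly reined in
constrained = [proc.next_note() for _ in range(8)]   # stays in the corral
```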

 

Crossing Boundaries

 

Traditional instruments have clear boundaries.  They may play in unison, combining to produce a new timbre; they may combine harmonically or contrapuntally.  But each performer is in control of the sound produced by his instrument and is the sole determiner of the notes he will play and when he will play them.  In contrast, with "virtual instruments" (controllers) like the Radio Drum and controller-instrument hybrids like the Zeta violin, the traditional boundaries between performers can become permeable membranes.  As a simple example, the violinist's glissando can change the pitch of notes produced by the percussionist, and the percussionist can control the loudness of the electric violin sound.  The situation is analogous to the humorous trick performed by bluegrass bands in which the guitarist reaches his left arm around the banjo player and fingers the chords on the banjo while the banjo player does the picking, and the banjo player simultaneously reaches his left arm around the guitarist's back and fingers the chords on the guitar.

 

The crossing of boundaries can happen on a larger formal level as well.  In the fifth movement, the percussionist plays arpeggiated consonant chords.  The root of each chord can be specified by either the percussionist or the violinist.  The violinist specifies the root by stepping on a pedal; the next note he plays sets the new root.  If he uses the pedal only occasionally, the harmony changes slowly and remains consonant.  If he uses it frequently, on every note for example, the effect is much more complex and dissonant.  Since the percussionist's chords are transposed according to the octave of the violin note, the violinist also has control over the percussionist's range.  At the same time, the percussionist has a similar pedal and can surprise the violinist by changing the harmony out from under him, but in a way intimately related to what the violinist has just played.  Thus, the harmony emerges from a complex improvisational interaction.
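The pedal-armed root mechanism can be sketched as follows.  This assumes chords are stored as intervals above a movable root; the class, the major-triad default, and all names are illustrative, not the piece's actual patch.

```python
# Sketch of the fifth movement's shared harmony control: either player
# arms a root change with a pedal, and the next violin note then sets
# the new root (and thus the register of the percussionist's chords).

class SharedHarmony:
    def __init__(self, root=60):
        self.root = root
        self.pedal_armed = False

    def press_pedal(self):
        """Either player arms the root change with a pedal press."""
        self.pedal_armed = True

    def violin_note(self, midi_pitch):
        """If the pedal was pressed, this note becomes the new root;
        its octave also sets the chords' register."""
        if self.pedal_armed:
            self.root = midi_pitch
            self.pedal_armed = False

    def percussion_chord(self, intervals=(0, 4, 7)):
        """Arpeggiated consonant chord built on the current root."""
        return [self.root + i for i in intervals]

h = SharedHarmony()
h.press_pedal()
h.violin_note(52)             # E3 becomes the new root
chord = h.percussion_chord()  # triad built on E3
```

Pressing the pedal on every note would make the root track the violin line note-for-note, producing the dense, dissonant result described above.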

 

An Example In Detail

 

The first movement begins with a simple interaction scheme, allowing the audience to perceive the causality between a performed action and the resulting synthesizer sound.  The violin, in addition to its acoustic sound, produces pitches via a "chord mapping set," a collection of twelve "chord mappings," one for each pitch class.  A chord mapping specifies the chord produced when a particular pitch class is performed, transposed to correspond to the performed octave.  The Drum selects which of several chord mapping sets is active.  As an example, one set might produce chords derived from chromatic tone clusters while another might produce a different octave displacement for each pitch.
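The chord mapping idea can be sketched directly: twelve mappings, one per pitch class, each giving a chord that follows the performed octave.  The two example sets below are illustrative assumptions; the piece's actual mappings are not given in the text.

```python
# Sketch of a "chord mapping set": a dict from pitch class (0-11) to a
# tuple of semitone offsets. The chord is built on the performed note,
# so it transposes with the performed octave.

CLUSTER_SET = {pc: (0, 1, 2) for pc in range(12)}             # chromatic clusters
SPREAD_SET  = {pc: (0, 12 + pc % 7, 24) for pc in range(12)}  # octave displacements

def apply_mapping(mapping_set, midi_pitch):
    """Harmonize one performed note through the active mapping set."""
    pitch_class = midi_pitch % 12
    offsets = mapping_set[pitch_class]
    return [midi_pitch + o for o in offsets]

# The Drum selects which set is active; here the violinist plays E4 (64):
active_set = CLUSTER_SET
chord = apply_mapping(active_set, 64)
```

Because the offsets are relative to the performed note, playing the same pitch class an octave higher yields the same chord transposed up an octave, as described above.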

 

The Drum's horizontal axis controls register, the vertical axis controls duration and the height above the surface controls loudness.  The surface is also partitioned in half, with one part of the Drum playing chords, and the other playing single notes.  Overlaying this partition is a grid that the percussionist uses to select the active chord mapping set.  Thus the familiar gesture of striking the drum can have the unfamiliar result of changing the harmonization of the violinist's melody, an effect usually considered in the realm of composition rather than performance.
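The Drum-surface mapping above can be sketched as a single function from mallet position to note parameters.  All ranges, scalings, and the choice of axis for the chord/single-note partition are assumptions for illustration.

```python
# Sketch of the first movement's Drum mapping: horizontal position
# controls register, vertical position controls duration, height above
# the surface controls loudness, and half the surface plays chords
# while the other half plays single notes. Coordinates are normalized
# to 0..1; all scalings are illustrative.

def interpret_hit(x, y, z):
    """Map a normalized mallet position to note parameters."""
    register = int(36 + x * 48)                       # horizontal -> MIDI register
    duration = 0.1 + y * 1.9                          # vertical -> duration (s)
    loudness = max(0, min(127, int(127 * z)))         # height -> loudness
    mode = "chord" if x < 0.5 else "single"           # surface partition
    return {"register": register, "duration": duration,
            "loudness": loudness, "mode": mode}

hit = interpret_hit(x=0.25, y=0.5, z=1.0)
```

A finer grid over the same surface, as described above, would simply add another quantized lookup from (x, y) to the active chord mapping set.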

 

Another interesting aspect of this movement from an ensemble standpoint is that both performers are playing the same synthesized sound at the same time, resulting in ambiguity as to who does what and enabling one player to "pull the rug out from under" the other player.

 

Summary

 

Our experience with Wildlife shows that improvisational ensemble music and interactive instruments can be a powerful combination.  The traditional inviolability of a performer's sole control over his instrument can be relaxed, and the degree to which one performer may intrude on the other's instrument can itself become a controllable musical parameter.  Adding semi-autonomous computer processes that the players control in the manner of a conductor further enriches the environment.

 

However, to discover an effective interactive scenario, it is necessary to spend a good deal of time playing with the system and learning its idiosyncrasies.   Improvisation is ideal for allowing this to occur, since it lets the performer react spontaneously to the musical situation.  No amount of programming skill and cleverness is a substitute for the process of using the system in a musical context. 

 

Acknowledgements 

 

Thanks to visionary instrument builders Max Mathews and Bob Boie (Radio Drum) and Keith McMillen (Zeta violin).  Thanks also to Michael McNabb, whose Ensemble program has been enormously useful in this work; Julius Smith, who co-developed the NeXT Music Kit with the first author; and Miller Puckette and David Zicarelli, whose Max program proved extremely valuable.

 

References

 

Boie, R., L. Ruedisueli, and E. Wagner.  "Gesture Sensing via Capacitive Moments."  Work Project No. 311401-(2099,2399), AT&T Bell Laboratories, 1989.

Jaffe, D., and W. A. Schloss.  "The Computer-Extended Ensemble."  Computer Music Journal, MIT Press, Winter 1992 (forthcoming).

Jaffe, D.  "The Computer-Extended Ensemble."  LULU, Buenos Aires, Argentina, Autumn 1992.

Mathews, M., and A. Schloss.  "The Radio Drum as a Synthesis Controller."  Proceedings of the 1989 ICMC, Columbus, Ohio.

McNabb, M.  "Ensemble, an Interactive Performance and Composition Environment."  Proceedings of the 1990 Audio Engineering Society Conference, Los Angeles, CA.

Moravec, H.  Mind Children.  Harvard University Press, 1988.

Puckette, M.  "Amplifying Musical Nuance."  JASA, Spring 1990.

Schloss, W. A.  "Intelligent Musical Instruments: The Future of Musical Performance or the Demise of Virtuosity in Music?"  Proceedings of the International Conference on Man/Machine Interaction, CNUCE, Pisa, Italy, 1991.

Schloss, W. A.  "Recent Advances in the Coupling of the Language Max with the Mathews/Boie Radio Drum."  Proceedings of the 1990 ICMC, Glasgow.

Smith, J., D. Jaffe, and L. Boynton.  "Music System Architecture on the NeXT Computer."  Proceedings of the 1989 Audio Engineering Society Conference, Los Angeles, CA.

Zicarelli, D.  "Writing External Objects for MAX."  Opcode Systems, 1991.