(Research Collated over the course of the year)
This class has been based on using current technology in different ways to create music. This way of working tends to lend itself to very forward-thinking, avant-garde genres and styles of music, mixing digital processing and programming equipment with physical instruments and interfaces, or with digital instrument synthesis. It also involves the use of digital or analogue effects to add to or alter an overall performance.
This style of music is not always highly technology-based; John Cage was a real pioneer of the genre with his use of texture and instrumentation to create avant-garde music. This is an example of how off-the-wall his work could get.
This is an example of a physical performance of avant-garde music. There is no digital editing and there are no effects, just sound made by a physical action. I believe that a lot of avant-garde music stemmed from the pioneering acts of John Cage.
This is a short video of a current avant-garde artist called Hauschka. It explains how the artist fed from John Cage's way of thinking and how he approaches his work. He also makes reference to how he uses purely acoustic sounds, without any digital effects or synthesis. I believe the fact that he makes a point of this in particular shows how it can be assumed that much of this style of music is now produced using digital instruments, effects or synthesis. Some artists like Hauschka take great artistic pride in the fact that they do it the natural, acoustic way, but because this is such a subjective genre when it comes to overall opinion, it is hard to say whether using digital effects or processing is "wrong". I believe that the use of progressive technology in music is a great thing: it allows music to adapt to the time we are in and also allows for almost endless capabilities with instruments and music production. Acoustically, you can only do so much to the sound of a piano, as this video shows; if you take this further by digital means, it opens up a whole new world of music which can be produced.
Examples of similar styles of digital music range from being very melodic and tuneful to being very hard to listen to and interpret.
This is an example of Iannis Xenakis' work. The piece involves sounds from a jet engine, trains, an earthquake and high, warped bell sounds. This is an example of using edited recorded sound as a composition.
It can be quite an uncomfortable experience for some people to listen to a piece like this. The conventional idea of the term "music" can easily be questioned when listening to it. This again seems to stem from John Cage's idea that any sound produced in any form can be called "music". However, different people's perceptions of music can differ greatly from one person to the next.
Another example of Xenakis' work is the piece S.709. This is an example of using digital or electronic synthesis to create a purely electronic sound. The rapidly modulating sounds are produced by continually varying the parameters of the synthesis itself.
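S.709 was reportedly generated with Xenakis' own GENDYN ("dynamic stochastic synthesis") program, in which the breakpoints that define one cycle of a waveform take small random steps on every repetition, so pitch and timbre drift constantly. The sketch below is a toy Python illustration of that idea; all parameter values are my own placeholders, not Xenakis' actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def gendy_cycles(n_points=12, n_cycles=200, amp_step=0.1, dur_step=2.0):
    """Toy dynamic stochastic synthesis: one waveform cycle is a set of
    breakpoints; each cycle, every breakpoint's amplitude and duration
    takes a small random step, so the sound modulates continuously."""
    amps = rng.uniform(-1, 1, n_points)                 # breakpoint levels
    durs = rng.integers(4, 16, n_points).astype(float)  # segment lengths
    out = []
    for _ in range(n_cycles):
        # random-walk the breakpoints, clipped to sensible ranges
        amps = np.clip(amps + rng.uniform(-amp_step, amp_step, n_points), -1, 1)
        durs = np.clip(durs + rng.uniform(-dur_step, dur_step, n_points), 2, 32)
        for i in range(n_points):   # linear segments between breakpoints
            a0, a1 = amps[i], amps[(i + 1) % n_points]
            out.extend(np.linspace(a0, a1, int(durs[i]), endpoint=False))
    return np.array(out)

wave = gendy_cycles()
print(len(wave), wave.min(), wave.max())   # signal stays within [-1, 1]
```

Because every cycle is slightly different from the last, the result is never a stable tone, which matches the restless, rapidly modulating character of the piece.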
An example of using looped tracks is "Come Out" by Steve Reich. This piece uses a short recording of a boy called Daniel Hamm who was involved in the Harlem riots of 1964. The recording is of the boy stating "I had to, like, open the bruise up, and let some of the bruise blood come out to show them". This was to try and convince the police that he had been beaten in the riots. The loop of "come out to show them" is played simultaneously on two channels at first, which quickly slip out of sync with each other, creating a phase effect which moves further and further apart to create different rhythmic effects. The two voices then split further to four, then again to eight, eventually leaving an unintelligible sound to the listener. The movement of the loops creates a great tension, especially considering that the loop itself does not change at all throughout; only its rhythmic placement and the thickness of the overall texture change, which makes it a very intense listen.
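The phasing process described above can be sketched numerically: two voices play the same loop, but one begins each repeat a few samples earlier than the last, so the offset between them grows steadily with every pass. This is a minimal illustration of the idea, not a reconstruction of Reich's actual tape process.

```python
import numpy as np

def phase_loops(loop, repeats, drift):
    """Mix two voices playing the same loop. Voice B begins each repeat
    `drift` samples earlier than voice A, so the offset between the two
    voices grows by `drift` samples on every pass (the phasing effect)."""
    n = len(loop)
    total = n * repeats
    a = np.tile(loop, repeats)              # voice A: steady repeats
    b = np.zeros(total)
    for r in range(repeats):
        start = r * (n - drift)             # each pass starts slightly early
        end = min(start + n, total)
        b[start:end] += loop[:end - start]  # overlapping passes simply sum
    return a + b

loop = np.arange(8.0)                       # stand-in for the spoken loop
out = phase_loops(loop, repeats=4, drift=1)
print(len(out))                             # 32: four repeats of 8 samples
```

Because the drift per pass is constant, the two voices move through every possible offset in turn, which is exactly what produces the changing rhythmic patterns before the texture dissolves.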
Looping systems are very effective in live situations, as a track can be built from nothing to have many layers and intertwining parts. This is a great example of Bon Iver taking a single looped idea and building on it to create a massive piece. He also uses a vocoder for the vocal effects.
Using loops allows you to build on a single idea and create thicker textures by layering more parts on top of a track. This is a great example of Imogen Heap using a looper live to build an entire track from scratch, with percussion elements and harmonies. She also uses the looper in such a way that she creates dynamic drops in and out as the piece progresses, almost giving the effect of a full band.
For building an instrument for this class, I would like to focus on the idea of using loops and being able to build a textural element of a track in some form. I will look into how this can be done using Max and available hardware.
Building My Instrument
After deciding that I was going to follow the idea of building some form of looping instrument, I looked at what was available in college to use as a physical piece of hardware to control what was happening. It could potentially have been done using only the digital software in Max, but I much prefer the physicality of moving sliders or pressing buttons to affect the sound being produced. This makes it feel much more like an instrument.
I found the KORG Kontrol -
This is a USB MIDI device which has 9 "channels" of controls; each has two trigger buttons, a slider and a turning knob at the top. The record, play and loop buttons are not particularly usable or needed in this case, and the "scene" setting I used was "2" (it was unclear what this setting actually did; I believe it changes the MIDI label of each part of the device, which will be explained further in this blog).
My plan was to use the device to control different layers of a recorded track using loops. The sliders would act as the "level" of each track, the two triggers as a "start" and "stop", and the knobs as some form of effect, possibly a tempo shift or filter effect. The most important and most difficult feature of the instrument would be that I wanted each channel to work independently of the others.
This being the main issue, I started out by using the MIDI tester in Max and found that each part of the device sent its own unique controller number. This meant that I could route each separate part of the device to its own unique control using the ROUTE object.
This object meant that each message sent by the device could control a separate part of the patch. The CTLIN object is what receives the initial MIDI control change messages from the device itself.
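The CTLIN-to-ROUTE behaviour in the patch can be sketched in plain Python as a dispatch on controller number. The CC numbers below are placeholders I have invented for illustration, not the device's real mapping (which depends on the "scene" setting).

```python
# Sketch of what CTLIN -> ROUTE does: incoming control change messages
# are split out by controller number, so each slider, knob and button
# drives its own part of the patch. CC numbers here are placeholders.
def route(cc_number, value, handlers):
    """Dispatch a (controller, value) pair to its matching handler,
    like Max's ROUTE object passing each message out its own outlet."""
    if cc_number in handlers:
        handlers[cc_number](value)
        return True
    return False  # ROUTE would pass unmatched input out its last outlet

received = {}
handlers = {
    0:  lambda v: received.__setitem__("level_track_1", v),      # slider 1
    16: lambda v: received.__setitem__("freqshift_track_1", v),  # knob 1
}

route(0, 100, handlers)   # slider 1 moved
route(16, 64, handlers)   # knob 1 turned
print(received)           # {'level_track_1': 100, 'freqshift_track_1': 64}
```

The key point, as in the patch itself, is that one stream of MIDI messages fans out into fully independent per-channel controls.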
Once I had this problem sorted, I started to build a simple looper. This did not have to involve any form of recording tool or audio-trimming patches, as I would be using pre-recorded loops from one of my original tracks called "Groovy Biscuits", the full version of which can be heard here -
https://soundcloud.com/everythingshines/groovy-biscuits-2-0
This is a shot of the looper -
The knobs on the device are routed separately to FREQSHIFT objects, which shift the frequency content of the audio track being played; this was a very nice effect to have on each separate track. The FREQSHIFT object throws the pitch of the melodic instruments off ever so slightly, which gives it a very nice tension, and on the drums it gives a waving phase effect which can add to the overall rhythmic feel of the track. The integer object linked to each of these is routed to the output of the knob; this is what controls each frequency shift.
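FREQSHIFT performs single-sideband frequency shifting, which is why it detunes things rather than transposing them: every component moves up by the same number of hertz, breaking the harmonic ratios between partials. Here is a numpy sketch of the same idea; it is a toy illustration, not the actual implementation of Max's object.

```python
import numpy as np

def freq_shift(x, shift_hz, sr):
    """Single-sideband frequency shift: every component of x moves up
    by shift_hz. Unlike pitch shifting, this breaks harmonic ratios,
    which is what creates the slightly 'off', detuned tension."""
    N = len(x)                      # assumes even N for simplicity
    X = np.fft.fft(x)
    h = np.zeros(N)                 # analytic-signal filter
    h[0] = 1.0
    h[1:N // 2] = 2.0
    h[N // 2] = 1.0
    analytic = np.fft.ifft(X * h)   # negative frequencies removed
    t = np.arange(N) / sr
    return (analytic * np.exp(2j * np.pi * shift_hz * t)).real

sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)           # 440 Hz test tone, 1 second
shifted = freq_shift(tone, 50, sr)
peak_hz = np.argmax(np.abs(np.fft.rfft(shifted)))
print(peak_hz)                               # dominant bin near 490 Hz
```

A harmonic at 880 Hz would land at 930 Hz after the same shift, which is no longer twice 490; that loss of harmonicity is the "thrown-off" pitch effect described above.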
The sound output of each of these loopers is linked to a GAIN object which is routed to each of the sliders, controlling the volume output of each track. The audio is then sent to an EZDAC, which plays it out as audio.
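The slider-to-level stage can also be sketched in Python. A MIDI slider sends values from 0 to 127, and mapping those to amplitude through a decibel curve keeps the fader feeling even across its travel. The curve below is an illustrative assumption, not the exact scaling Max's GAIN object uses.

```python
def slider_to_gain(cc_value, floor_db=-70.0):
    """Map a 0-127 MIDI slider value to a linear gain factor through a
    decibel curve, so the fader feels even across its travel. This is
    an illustrative mapping, not the exact curve of Max's GAIN object."""
    if cc_value <= 0:
        return 0.0                         # slider fully down = silence
    db = floor_db * (1 - cc_value / 127)   # 0 dB at top, floor_db near bottom
    return 10 ** (db / 20)                 # convert decibels to amplitude

print(round(slider_to_gain(127), 3))   # 1.0 at full travel
print(round(slider_to_gain(64), 3))    # somewhere well below half amplitude
```

Multiplying each loop's samples by this factor before summing them is exactly the per-track level control the sliders provide in the patch.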
I tried to match the look of the patch to how the actual device is visually laid out, to create a nice moving visual when it is being played. I feel that this was a nice element to have, as it meant you could see what you were doing with the device happening in real time on the screen. This is a shot of the finished product in presentation mode.
This is a short video of the final performance showing the patch in action with a brief explanation.
I am happy with how the patch turned out in the end, as it works very well in context. The original idea was to create a looper interface into which any loops could be loaded and played to the same effect; the effects knob on each track could also be changed to suit whatever material is being used. I think the simplicity of the patch works very well, as it could be used by someone who does not know a great deal about using Max.
This simplicity, however, could also be seen as a downfall. Much more could be added to the patch to give it more depth as an instrumental interface, such as multi-effects and live loop-recording interfaces, to make it much more complex and powerful. This would make it much more useful in a live setting, as loops could be recorded on the fly rather than relying on the pre-recorded setup it has at the moment.