Week of 10/10/17:
We came up with a couple of possible directions for our project. The first involves live visual art generated in real time. We began by noting how therapeutic we found drawing in MS Paint, which led us to a website called "Flock Draw," a similar online program that lets people draw in real time. The special thing about Flock Draw is that multiple people can access the same online canvas at once, so people in different locations can each interact with and add to the same drawing simultaneously. We would then improvise to the live art as it was being created, using our instruments extended with electronics. It is possible that we could provide a theme to help guide the artwork being made. The video below is an example of live art with sound: the artist responds to the musical improvisations while the musicians respond to the art. This is an interesting idea that we could also try with our piece.
Another idea we had was to use some sort of narrative to improvise along with. The text could be a voice-over of anything - we had tossed around the idea of using a recipe or a similarly mundane text. The example below shows a story read aloud with electronic and improvised music accompanying it.
Finally, one thing that seemed to work well when we were improvising together was improvising on a general theme. Making animal noises seemed to work best for us, so we thought of doing an improvisation piece on animal sounds. The video below includes electronic improvisations with bird sounds.
In each of these, we would extend our instruments electronically. We experimented with using Logic, but we felt that it did not give each of us enough control over the electronic effects we were generating, since there was only one interface, which Kyle had to control by himself. We are now experimenting with Ableton Pushes so that we can each control our own electronic sounds individually. This would give us more control over our improvisation, letting us respond better to each other and to the external element (visual art, text, etc.).
Week of 10/3/17:
We met last Wednesday (9/27/17) to discuss what kinds of music and projects we were generally interested in. Kyle shared a project video involving oscilloscope-generated music, in which the audio signal is fed directly into the oscilloscope, and vector graphics are drawn with sound.
Among the videos Matthew shared were several electric bassoon videos by Paul Hanson, who uses improvisation and forms of electronics to extend the instrument, both of which we are interested in pursuing in our project.
I shared a video that I found earlier of another Alexander Schubert piece, entitled "Weapon of Choice." The piece involved a solo violinist who triggered live electronic sounds either through the sound of what was being played or through the movements of the bow. In other performances of the piece, background visuals were also generated based on the movements of the bow. We were interested in exploring some of these live-triggered sounds and visuals in our project.
We also met on Sunday (10/1/17) in the Davis Studio to explore improvising together and to learn how to set up the room. We learned how to set up the sound system so that sound could come through different groups of speakers, and we learned how to set up various kinds of microphones that we could use when improvising. We also experimented with live processing of our instruments' sounds through Logic. We realized during this session that if we wanted to process all three of our instruments, we would need more than one interface (and we may also need to use Ableton Live on one of them, since only one of our three computers has Logic).
Week of 9/26/17:
So far, we have discussed a couple of ideas that we want to explore. We talked about using live processing to trigger alternate, electronically generated sounds by playing either key clicks or regular notes. We also talked about creating a video or some other kind of visual to accompany the audio. One possible use for these electronically generated sounds would be to have them mimic dialogue in the video narrative, with certain sounds assigned to each character. Another musical idea was to use short, disjunct fragments that would create something completely new when combined, or to create a video and performance that is deliberately ironic (e.g. 80s music videos). Perhaps we can find a way to combine or edit some of these ideas.
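The trigger idea above could be prototyped outside of Logic or Ableton with a simple energy-based onset detector. The sketch below is a minimal illustration in Python with NumPy (our assumption for illustration; the post does not name a tool for this step): it flags frames where the signal's short-time energy suddenly jumps, which is roughly what a key click or note attack looks like, and each flagged frame could then be mapped to playback of an electronic sound.

```python
import numpy as np

def detect_onsets(signal, frame_size=512, threshold=0.1):
    """Return indices of frames whose short-time energy jumps above
    `threshold` relative to the previous frame. Such a jump (a key click
    or a note attack) could be mapped to triggering an electronic sound."""
    n_frames = len(signal) // frame_size
    energies = [float(np.mean(signal[i * frame_size:(i + 1) * frame_size] ** 2))
                for i in range(n_frames)]
    return [i for i in range(1, n_frames)
            if energies[i] - energies[i - 1] > threshold]

# Synthetic one-second signal: silence, then a short noise burst (a "key click").
sr = 44100
signal = np.zeros(sr)
signal[sr // 2 : sr // 2 + 512] = np.random.default_rng(0).uniform(-1, 1, 512)

print(detect_onsets(signal))  # the frame index where the burst begins
```

A real setup would run this on live audio buffers from the interface and would likely use a proper onset-detection library, but the threshold-on-energy idea is the core of distinguishing "a note or click just happened" from background sound.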
Our timeline can be found at the following link:
In addition, you can find our group charter here: