One of the things we discussed last class was using instrument sounds to mimic speech/dialogue. An example of this is in Steve Reich's WTC 9/11 (personally I don't like this piece very much, and it carries a lot of ethical baggage, but that's a loaded topic I won't get into!). Still, I think the way he uses instruments to play in rhythm with recordings of human speech is interesting. We were thinking of taking this a step further and using live processing to alter the instrument sounds themselves, triggering random, seemingly unrelated noises that could then be played in time with dialogue in the video accompanying the performance.
The specific idea we discussed earlier, having instrument sounds or key clicks trigger other noises, is explored more in the track below (which Matthew actually recommended to me last class!). Instead of playing regular trumpet sounds, Mark Kirschenmann uses Logic to trigger other sounds, each corresponding to a different note he plays on his trumpet.
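Just to sketch what that kind of note-to-sound mapping looks like in principle (this is a made-up illustration, not how Kirschenmann's Logic setup actually works; the note numbers and sample names are placeholders):

```python
# Hypothetical sketch of note-triggered sample playback: a detected
# pitch (as a MIDI note number) looks up a sample to fire instead of,
# or alongside, the instrument's own sound. All names are invented.

NOTE_TO_SAMPLE = {
    60: "glass_shatter.wav",  # C4 triggers one noise
    62: "radio_static.wav",   # D4 triggers another
    64: "door_slam.wav",      # E4, and so on
}

def sample_for_note(midi_note):
    """Return the sample to trigger for a detected pitch, or None if unmapped."""
    return NOTE_TO_SAMPLE.get(midi_note)

print(sample_for_note(60))  # glass_shatter.wav
print(sample_for_note(61))  # None (unmapped notes trigger nothing)
```

In a live setup, the lookup would be fed by a pitch tracker listening to the instrument, which is roughly what a sampler plugin does when driven by audio-to-MIDI conversion.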
Another idea we talked about was basing a piece on short, fragmented, contrasting ideas mashed together, so that the audience gets "whiplash" from immediately moving between random fragments. The piece below by John Zorn ("Cat O' Nine Tails") actually doesn't have electronics in it, but it's built entirely on this fragment idea, moving through completely different styles roughly every ten seconds.