Some interesting ideas here:
- A set composition that is advanced by live triggers.
- Using an acoustic instrument as a controller. Possibly using pitch detection.
- Using video, text and spoken word for "conceptual content" in combination with the abstraction of music/sound.
- The integration of poetry as a medium was nice.
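On the pitch-detection idea: a minimal sketch of how an acoustic instrument's audio could be turned into control data, assuming NumPy is available and the input is monophonic. The autocorrelation approach and all parameter values here are illustrative choices, not anything used in the performance.

```python
import numpy as np

def detect_pitch(signal, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental frequency of a monophonic signal
    via autocorrelation: the lag of the strongest self-similarity
    peak corresponds to one pitch period."""
    signal = signal - np.mean(signal)
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]          # keep non-negative lags only
    # Restrict the search to lags that map to plausible pitches.
    min_lag = int(sample_rate / fmax)
    max_lag = int(sample_rate / fmin)
    lag = min_lag + np.argmax(corr[min_lag:max_lag])
    return sample_rate / lag

# Synthetic check: a 440 Hz sine at 44.1 kHz.
sr = 44100
t = np.arange(0, 0.05, 1 / sr)
tone = np.sin(2 * np.pi * 440.0 * t)
freq = detect_pitch(tone, sr)  # within a few Hz of 440
```

In a live setting the detected pitch (or the note it quantises to) would then feed whatever trigger logic drives the composition.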
I wasn’t so impressed with the actual content of the piece, but the delivery, combined with the fast pace of the musical style, was very effective. I imagine it would have great impact live, assuming you were standing close enough.
He seemed stuck behind the instrument and sheet music, so he would be physically constrained in meeting the audience and thus have fewer options for engaging them.
Ideas for future directions from here might be:
- Gesture recognition, i.e. using a series of notes (a "musical phrase") as a trigger for navigating the composition, perhaps allowing for non-linear compositions; also for triggering smaller sequences of visuals/spoken word.
- A different interface for making music that allows more interaction with the audience. Perhaps wireless sensors.
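The phrase-as-trigger idea above could be sketched roughly as follows: incoming notes (from pitch detection or MIDI) fill a sliding window, and recognising a registered phrase jumps the composition to a named section, which is what would make non-linear navigation possible. All note values and section names here are invented for illustration.

```python
class PhraseNavigator:
    """Watches a stream of MIDI note numbers and jumps to a new
    section of the composition when a registered phrase is played."""

    def __init__(self, phrases, start="intro"):
        # phrases: {tuple_of_midi_notes: target_section_name}
        self.phrases = phrases
        self.max_len = max(len(p) for p in phrases)
        self.buffer = []            # most recently heard notes
        self.section = start

    def on_note(self, midi_note):
        """Feed one detected note; return the new section on a match."""
        self.buffer.append(midi_note)
        self.buffer = self.buffer[-self.max_len:]   # sliding window
        for phrase, target in self.phrases.items():
            if tuple(self.buffer[-len(phrase):]) == phrase:
                self.section = target
                self.buffer.clear()  # avoid re-triggering on stale notes
                return target
        return None

# Example: an ascending C-E-G arpeggio jumps to a hypothetical "verse"
# section, and the descending form jumps to a "coda".
nav = PhraseNavigator({(60, 64, 67): "verse", (67, 64, 60): "coda"})
for note in [62, 60, 64, 67]:
    nav.on_note(note)
print(nav.section)  # "verse"
```

Because each phrase maps to an arbitrary target section, the same mechanism supports both a linear set list and a branching, non-linear structure, and smaller phrases could just as well fire visuals or spoken-word cues.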