In a music informatics reading group I attend when I can, we discussed a paper by David Huron entitled Tone and Voice: A Derivation of the Rules of Voice-leading from Perceptual Principles. Here are a few random thoughts and reflections arising from my reading of this paper.
Much of it unfortunately went right over my head, largely due to my ignorance of some important music concepts. However...
The toneness principle, in my view, may well give us an idea about music relevance, i.e. how users perceive and identify music. This is image identification with sounds, or as Huron puts it, 'sounds may be regarded as evoking perceptual images'. Initially I interpreted this at rather a high level, but the reading group pointed out to me that the sounds are actually at a low level, e.g. a passing car or bells ringing; in other words, single sounds. Which doesn't in any way go against my argument. It would be an interesting project to see how, or if, users do in fact associate sounds with images (if it hasn't already been done), e.g. the way I associate Beethoven's Pastoral Symphony with the countryside, or Dvorak's New World Symphony with the North of England (the Hovis ad). In my view, high- or low-level identification of sound with images could provide us with a link to music relevance.
Later on he talks about the principle of temporal continuity, which further reinforced my views. Huron states that 'auditory images have a tendency to linger beyond the physical cessation of the stimulus'. The examples he gives are timpani rolls and a bubbling brook: a sound evoking an image that stays with you. More evidence of image identification with sound, if not music? Perhaps the length of the instrument's sound has some influence here (the longer the sound, the more likely it is to evoke an image for the listener).
Perhaps music can evoke motion as well as images. Huron states that 'it would seem that the sense of continuation between two tones is an auditory analog to apparent motion in vision'. The example he gives is some research using two lamps that could be switched on and off, where the 'sense of apparent motion depends on the distance separating the two lamps and their speed of switching'. If I may speculate, two instruments (say) weaving in and out of each other's voices could evoke a sense of movement, and hence contribute to our understanding of music relevance.
In Fig. 17 of the paper, a classification of music is given based on a two-dimensional graph of onset synchronisation vs semblant motion (melodies moving in the same direction). It was speculated in the reading group that you could use this as a filter for query by example, e.g. pick collections of music in some region of this graph and then use user-generated knowledge (via metadata) to further refine the query.
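The reading group's filtering idea could be sketched roughly as follows. Everything here is hypothetical: the (onset_sync, semblant_motion) coordinates are assumed to be precomputed and normalised to [0, 1], and the track data and tags are invented purely for illustration.

```python
# Sketch of the query-by-example idea: select tracks falling in a
# rectangular region of the onset-synchronisation / semblant-motion
# plane, then refine with user-generated metadata tags.
# All coordinates and tags below are invented for illustration.

def filter_by_region(tracks, sync_range, motion_range, required_tags=None):
    """Return titles of tracks inside the given 2D region,
    optionally requiring that each track carries all given tags."""
    required_tags = set(required_tags or [])
    results = []
    for t in tracks:
        in_region = (sync_range[0] <= t["onset_sync"] <= sync_range[1]
                     and motion_range[0] <= t["semblant_motion"] <= motion_range[1])
        if in_region and required_tags <= set(t["tags"]):
            results.append(t["title"])
    return results

tracks = [
    {"title": "Chorale", "onset_sync": 0.9, "semblant_motion": 0.8, "tags": ["sacred"]},
    {"title": "Fugue", "onset_sync": 0.2, "semblant_motion": 0.3, "tags": ["baroque"]},
    {"title": "Hymn", "onset_sync": 0.85, "semblant_motion": 0.9, "tags": ["sacred", "choral"]},
]

# Pick out the homophonic-like corner: high synchronisation, high semblant motion.
print(filter_by_region(tracks, (0.7, 1.0), (0.7, 1.0), required_tags=["sacred"]))
# → ['Chorale', 'Hymn']
```

A real system would of course need a way to estimate the two coordinates from audio, which is a research problem in itself; the sketch only shows the filtering step.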
And now for something really quite enjoyable. The Huron paper mentions the Hurdy Gurdy, an instrument I'm quite interested in, and I found this video of a performance by Melissa Kacalanos on the New York subway:
Marvellous, truly marvellous! She has her own Web page. This, however, is just bloody funny! What this man can do with his hands is nothing short of amazing.
LOL! Bohemian Rhapsody will never sound the same again. He actually gives a tutorial as well:
Have a try...
Huron, D. (2001). Tone and Voice: A Derivation of the Rules of Voice-leading from Perceptual Principles. Music Perception, 19(1), pp. 1-64.