An approach to simplifying theory
Here is an insight into my approach to simplifying music theory. Of course, before recording was invented, the only way to share music was via music notation, so every detail was present, down to the last minutia. For many, this means the wood (musical concepts) cannot be seen for the trees (the embodiment of those concepts in the music). But these days, technology offers alternatives…
In addition to my love affair with the guitar, music, and improvisation over the last few decades, the following three areas fascinate me. They totally guide the design of emuso and its interactive content, to reveal the simplicity inherent in music theory.
- Music psychology and learning
- How to minimise complexity in system design
- User experience design
(Emuso provides many more benefits to musicians, such as the ability to quickly create and acquire practice regimes, but these aren’t discussed here.)
Music psychology and learning
This is mind-blowing. I first got into it when I wanted to know how pitch was defined. That led me to look up perception, which started a journey of exploration into how on earth air molecules moving backwards and forwards as they enter our ears end up as our brain's emotional response to, and appreciation of, music as it unfolds.
Some very interesting facts emerge from all the research that’s been done. For example, the brain absolutely hates randomness … it’s always looking to make sense of the environment, and this has some obvious consequences for musicians, especially those starting their musical journey. In particular, the sooner the use of musical phrasing is appreciated, along with how to properly use the notes in a scale, the better. Otherwise, trial-and-error attempts can be very demotivating. I will be exploring this more in an upcoming blog post about musical structure.
Other very interesting observations concern how all children learn. Their first experiences after birth are all about touch and sound. Then vision comes along. Then speech develops, imitating their parents. Reading has to wait. Children also have a good sense of rhythm (in fact our bodies are all about rhythms and synchronisation, from breathing, to heartbeat, to circadian rhythms). Children recognise melodies, and certainly know if their favourite melody gets changed or sung out of tune by their mother. They can memorise tunes, and sing them.
What does all this tell us? We all (unless physically or cognitively impaired) have an innate sense of hearing musical relationships (intervals, and larger musical ideas built from intervals: phrases, verses, songs, sections of an opus), and an innate sense of rhythm.
I heartily recommend the following two books, that can almost be read like a novel (there are many others, more academic ones):
- Music, the Brain, and Ecstasy, by Robert Jourdain.
- Sweet Anticipation, by David Huron.
Applying lessons from system design to explaining music theory
I’ve spent decades working in the design and architecture of many types of complex system software (CAD/CAM, computer languages and compilers, computer graphics, device emulation, virtual machines, embedded systems …). I learned to appreciate that a design is easier to understand and maintain when each part of the design takes care of some specific job, with unnecessary detail removed, and with as little knowledge of the other parts as possible.
Having studied music theory and applied it for decades, I thought life could be a lot easier for us musicians by taking a similar approach, combined with widely familiar visualisations. In the context of music, maintenance means keeping foundational musical concepts present all the time.
Huge simplifications arise from:
- Removing note names when discussing note combinations, and instead using intervals.
- Presenting intervals both as semitones (an obvious mapping to the instrument) and as theoretical names (major 3rd, etc.).
- Removing note names when discussing musical relationships (e.g. chord to scale), and instead using intervals.
- Removing rhythm from any discussion about note combinations, initially.
- Removing actual notes when discussing rhythm, where appropriate.
- Visualising rhythm using rectangles denoting duration.
This is where using digital technology shines, as these simplifications are all entirely possible when we make use of computer graphics, sound, and multimedia.
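To make the interval-first idea concrete, here is a minimal sketch of describing a note combination purely by semitone distances, with theoretical names layered on top only as a second step. The list and name table below are my own illustration, not emuso's internal representation:

```python
# A major scale described purely as semitone distances from its start note.
# No note names appear anywhere: the same pattern works in any key.
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]

# Theoretical interval names, layered on top of the semitone view.
INTERVAL_NAMES = {
    0: "unison", 2: "major 2nd", 4: "major 3rd", 5: "perfect 4th",
    7: "perfect 5th", 9: "major 6th", 11: "major 7th",
}

for semitones in MAJOR_SCALE:
    print(f"{semitones:2d} semitones -> {INTERVAL_NAMES[semitones]}")
```

Because the semitone distances map directly onto fret distances, this single list is enough to find the scale anywhere on the neck, long before any note names are learned.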
For example, we can visualise interval combinations extremely simply by using a very common item … a clock face (or watch face). Below, the G major scale is laid out along the bass E string. The red circle on the guitar denotes the scale start note; it appears twice. The numbers labelling the various scale members show their distances in semitones from the nearest scale start note below them.
Below the guitar neck is a clock face, with times 0 (midnight) to 11 AM. Notice how the pattern of coloured circles on the clock (“occupied times”) and the “unoccupied times” is mirrored on the guitar neck. The clock also has 12 spokes around it, with orange-coloured spokes at times 0, 4, and 7 AM, indicating a chord rooted at 0 AM (the scale start note) found in the scale.
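The clock-face picture can be sketched in a few lines of code: the 12 clock times are the 12 semitones of the octave, a scale is a set of occupied times, and a chord fits the scale exactly when its times are all occupied. This is an assumed, simplified model for illustration, not emuso's actual code:

```python
# The clock face: 12 positions (0..11), one per semitone in the octave.
# The major scale "occupies" these clock times (same pattern as on the neck).
scale_times = {0, 2, 4, 5, 7, 9, 11}

# The orange spokes at times 0, 4 and 7 describe a triad rooted at time 0.
chord_times = {0, 4, 7}

# The chord belongs to the scale when every chord time is occupied.
print(chord_times <= scale_times)  # True: this triad is found in the scale

# On the bass E string, the same pattern reappears as fret offsets from the
# scale start note; the second octave is just each time plus 12 frets.
frets = sorted(scale_times | {t + 12 for t in scale_times})
print(frets)
```

The subset test is why removing note names simplifies things: "which chords live inside this scale" becomes a question about two sets of clock times, independent of key.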
User experience design
UX design is critical. I remember seeing a wonderful picture that captured the essence of UX design. It showed a baby in a cot, with a bunch of small fluffy animal toys hanging from a horizontal ring that the parents had put together. It then showed the view as seen by the baby … a bunch of animal rear-ends! There’s a lesson there.
Here are some of the ways we’ve addressed the UX for helping you get familiar with musical concepts. Firstly, the design is a mix of visuals and sounds. It carries theoretical information such as interval and note names and scale structure, which can be changed with a key command. This is just one aspect of functionality available very simply (usability). Menu design gives access to the usual things such as scales and chords. Simple key commands let you visualise scales in different regions (similar to CAGED), and the mouse can be used to draw out a rectangular area in which to lay out the current scale or chord. By holding down the ALT key over a scale note on the guitar, you hear and see the appropriate chord rooted at that note. Content-wise, the design enables interactive multimedia lesson and practice content, for testing the understanding of musical concepts on-instrument, with a virtual machine that knows how to correct mistakes on-instrument. While the design separates rhythm and notes, it also enables the design of pure rhythmic concepts, which with a single click can be filled with a note pattern from the instrument.
There is also a fully fledged video-based help system which, when turned on, shows relevant help for a button, icon, menu item, and so on.
By applying the simplifications mentioned above, in particular avoiding (or de-emphasising) note names, and avoiding mixing rhythm with notes when discussing note combinations and musical relationships, a lot of clarity is gained, which can be taken to the instrument on day one (with the obvious issue of lack of dexterity at that time). In particular, intervals are better appreciated if presented initially using semitones between the notes involved, rather than their theoretical names. This means literally anyone can create a scale on-instrument in minutes without knowing a single note name or where to locate any notes. They can hear and play what’s being presented, and start experimenting, on day one. That has proven to be very motivating and encourages further learning. Technology allows all these simplifications to be made. This approach can then be used alongside traditional methods, including notation, if wanted.