MUSIC OF THE SPHERES

What is music and why does it move us? Scientists mapping the activity of the brain are beginning to unravel the reasons for the universal human response to rhythm and tone

Paul Robertson
Saturday 04 May 1996 23:02 BST

Music, with its power to move and soothe, has long been recognised as a measure of civilisation. Indeed, for many previous cultures, such as the ancient Greeks, mathematics, astronomy and philosophy were all interconnected, seen as different aspects of the same knowledge. Every physical phenomenon, the Greeks believed, could be explained in terms of musical laws. Then this view of the world changed. Science and music were hived off into separate disciplines, the latter becoming part of the canon of "artistic" thought. Now, the process may be about to turn full circle. Scientists are re-discovering the fundamental importance of music to the human mind, building a bridge between disciplines.

The contemporary meeting place for music and science is in the area of brain mapping - and, in particular, the findings of modern neuro-psychiatrists about the physical basis of our musical perception. Though the scientific language is entirely new, many of the questions being asked are as old as human thought itself. What is music? Why do we have it? Is music a language? If so, what does it communicate? Why does music move us? Many of the answers lie in the inextricable connections between the evolution and anatomy of our brains and our fundamental musical responses. Our musical language is, it seems, a product of our neurology.

To consider music as a language, we must understand the functions of the brain hemispheres. Research has shown that in right-handed people the left half of the brain is dominant and devoted to sequential, logical thinking - verbal language. The right hemisphere views the world spatially and emotionally. Though it has virtually no verbal ability, it is highly musical. Most importantly, it invests our perceptions with meaning. (In left-handed individuals, the right hemisphere is dominant.)

Patients who have had the right hemisphere of their brain removed seem to inhabit a literal, cold, emotionless world - yet their ability to use words and think logically is unimpaired. A classic illustration is that of a patient who had only a left brain. His doctor asked "How are you feeling this morning?" - and in the typical, jerky monotonous voice of such a half-brained individual, he answered, Dalek-like: "With ... my... hands."

It is broadly accepted that, for the right-handed, music is largely a right-hemisphere function. So, what remains for an individual who suffers gross left-brain damage? The case of Stephen Wade illustrates this. Until about three years ago, he was a multi-lingual international telephonist and amateur composer. Then he suffered a massive stroke in the left hemisphere of his brain, which left him wheelchair-bound and unable to use the right side of his body. Because the left hemisphere of the brain is so involved in speech and verbal language, Stephen's stroke left him bereft of words. His short-term memory is also severely impaired. Questions cannot be framed as choices - "Tea or coffee?" - because he cannot retain more than one item at a time in his mind. Stephen cannot speak, only nod or shake his head, yet he is able to use his left hand and play a keyboard fluently. Miraculously, he can pick up a pen and use it - not to write words (even his own name is impossible for him), but to write music as witty and energetic as ever.

Cases like this, together with modern technology used to trace specific areas of brain activity to particular musical skills, are enabling scientists to map the musical mind. We now know that discordant ("clashing") chords create erratic neurone firing patterns in the brain. By contrast, concordant ("harmonic") chords give rise to even neurone firing. Various studies confirm that "tuneful" traditional music appeals to our right (emotional) hemisphere and that a preference for such musical "concords" is shared by other mammals.

Measuring the brain's electrical response to music is also illuminating. All listeners have an equivalent electrical brain surge after hearing a "wrong" note or an incongruous silence in a piece of music. We are all seemingly "wired" to be musical by nature. This response to sound underlies not just musical language, but verbal communication as well.

Diana Deutsch in San Diego, California, is an expert in the field known as music cognition. She has discovered that we interpret the significance of certain combinations of notes according to where we learned our mother tongue. Cultural origin seems to dictate how individuals "hear" pairs of tri-tones - the interval obtained by dividing an octave exactly in half, so called because two notes are then three whole tones apart. What Diana Deutsch discovered was that different listeners, when presented with pairs of tri-tone chords, tended strongly to hear either a rising or a falling sequence, according to where they had learned English. Why? In fact, these chords neither rise nor fall. The answer must lie in our very earliest responses to language.
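The ambiguity Deutsch exploited follows from simple arithmetic. A minimal sketch (the frequencies and note names are illustrative, not from her experiments): in equal temperament each semitone multiplies frequency by the twelfth root of two, so six semitones - a tritone - multiply it by exactly the square root of two, half an octave. A tritone up and a tritone down from the same note therefore land on the same pitch class, an octave apart, which is why the interval has no inherent direction.

```python
# Arithmetic of the tritone in equal temperament.
# A semitone multiplies frequency by 2**(1/12); six semitones make a tritone.
SEMITONE = 2 ** (1 / 12)

def transpose(freq_hz, semitones):
    """Return the frequency `semitones` above (or below, if negative) freq_hz."""
    return freq_hz * SEMITONE ** semitones

a4 = 440.0                 # concert A, used here purely for illustration
up = transpose(a4, 6)      # tritone up:   440 * sqrt(2), about 622.25 Hz
down = transpose(a4, -6)   # tritone down: 440 / sqrt(2), about 311.13 Hz

# The two results are exactly an octave apart (ratio 2): the tritone splits
# the octave in half, so "up" and "down" reach the same pitch class.
print(round(up, 2), round(down, 2), up / down)
```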

Our response to sound begins in the womb. Unlike our eyes, our ears cannot be "shut"; all sound at all times has to be interpreted. To do this we have evolved an exquisitely complex set of systems. As early as six months old, babies display highly developed abilities to recognise musical structures. Recent research suggests that music and language may sound very similar to them, because they simply hear the intonations of the voice (prosody). Even in our earliest development, sound has a special significance. Newborn babies clearly respond to particular voices and prosody - the musicality of speech. The speaking voice of adults reflects their emotional disposition. Musical significance precedes verbal.

So how does this system develop? Tony de Blois, an "idiot-savant", provides us with some clues. Brain-damaged at birth, blind and autistic, 21-year-old Tony has only just learned to make himself a sandwich and is still unable to tie his shoe laces. Yet he is an outstanding pianist, with a musical memory for over 7,000 songs and a gift for jazz improvisation.

An idiot-savant often has limited intellect but is computational (skilled at maths or dates) or has an extraordinary memory. Tony's exceptional ability became evident at the age of two, when his mother bought him an electronic keyboard. She hoped the sound might encourage him to reach out and sit up - something he had so far been unable to do.

"For the first six weeks it was hell," she remembers. "Tony simply played every possible combination of notes randomly over and over. But one day, I heard the first three notes of 'Twinkle, twinkle little star'; I went in and showed him the rest of it. There has been no looking back." At the drop of a hat, Tony can play any one of his 7,000 songs. He will leap without any clumsy transition from Bach, to Lloyd-Webber, to improvised scat. He sings as he plays incredibly complex jazz improvisations.

Through his music, Tony is expanding his interior intellectual world and his ability to relate to the outside environment, too. He can now converse fluently, and is able to relate to others in words - albeit at a somewhat literal level. Musically, he can express directly very powerful feelings. His limitations and his great gifts suggest strongly that music may endure and develop even where other skills are unfulfilled.

Our musical responses are in part physiological, the sequence of notes drawing on rhythmic empathy for its effect - in other words, its sounds echo the natural rhythms of our own bodies. Film and television advertising and pop music are largely based on the rhythms of heartbeats and breathing; it is this that makes them powerful and pleasing. From the formative period in the womb onwards, the beating heart in particular measures out the tenor of our days. So persuasive are these basic pulses that we hardly ever hear them consciously, yet they can and will directly affect our responses and our moods.

The exploration of pulse is, of course, a prime consideration for musicians (as it is in Oriental medicine), but it is only recently that it has attracted scientific appraisal. David Epstein, conductor and Professor of Music at the Massachusetts Institute of Technology, has researched how these underlying rhythmic systems are expressed in music and why these tempo relationships affect us so profoundly. He concludes that the simple tempos we prefer in music are intrinsically compatible with our neurological and biological make-up. In fact, they reflect the way we process time at the most fundamental level.

Music reinforces our impression of passing time as a series of unfolding events, and structures it in unique ways. Motion, the quintessential basis of all music, is profoundly linked with emotion - hence their common semantic origin, ex motus, "from movement". In shaping music, performers shape the flow of time and they also shape emotion. By doing so they communicate with others.

This idea ties in with Dr Manfred Clynes's original research into the expressive qualities of music. Many years ago he began to explore the physiology of emotional expression - how emotion is expressed through our bodies. His theory was that an emotion must suffuse our whole system. Feeling angry, for example, is not merely a thought but a broader sensation which involves "angry use" of the whole body. We are able to recognise many such systems of expression. We all know if someone is angry without their having to tell us.

Clynes discovered that, across cultures, similar emotional states are expressed by similar micro-muscular responses. These are the building blocks of our repertoire of emotional gesture. Clynes proposed that such emotional-body gestures must underlie our body language and other ways of expressing emotions. He proposed that music - a particularly potent system of emotional expression - would use just such a repertoire of reflexes.

Using computer modelling, Clynes analysed this link between our emotional feelings and muscular movements. Working with musicians, he found that interpreting a composer successfully requires a specific expressive style - not just achieved by intellectual study of the score, but by a deeply physical muscular empathising. Applying the tonal weight and rhythmic attack of Brahms to Mozart, for example, ends not with an interpretation, but a parody. These principles he has quantified and used within his remarkable new interactive computer system.

The link between physical gesture and musical expression is being explored further at the Health Centre in San Antonio, Texas, where neuroscientist Peter Fox is looking directly into the brains of performing musicians. He uses PET (positron emission tomography) scanners to provide visible cross-sections of blood flow within the brain. These scans reveal not only the activity of the motor function centres that control and remember the body movement involved in playing, but - even more remarkably - the areas of the brain that experience or create musical meaning. By comparing differences in brain activity when performing music - in this case Bach - and when playing scales, Fox and his team are getting close to defining the very essence of music. This is the cutting edge of musical science.

The information from PET scans is ascertained by a process of subtraction. For example, the brain's activity is measured when a pianist is at rest; then the pianist is asked to play scales. The difference between rest state and scale state clearly shows which parts of the brain are involved in that task - including, in this case, the motor function areas which control movement memory.

Next the pianist is asked to play a piece of Bach (chosen because it is itself largely made up of scales). When the scale reading is compared to the Bach performance reading, we can see that those areas in the right hemisphere implicated in auditory perception are highly activated. This is the area associated with emotion, movement and meaning. Emotional qualities are being added to the physical movements of playing.
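The subtraction logic itself is simple enough to sketch in a few lines. The numbers and region names below are invented for illustration, not real PET data: each condition is just a map of brain region to activation level, and subtracting a baseline condition from a task condition leaves the activity specific to the task.

```python
# Sketch of the PET "subtraction" method, with invented activation values:
# each condition maps a brain region to a (hypothetical) activation level.
rest   = {"motor": 1.0, "auditory": 1.0, "right_hemisphere": 1.0}
scales = {"motor": 3.0, "auditory": 1.5, "right_hemisphere": 1.1}
bach   = {"motor": 3.1, "auditory": 2.5, "right_hemisphere": 2.4}

def subtract(task, baseline):
    """Activity present in `task` beyond what `baseline` already shows."""
    return {region: task[region] - baseline[region] for region in task}

# Scales minus rest isolates the motor machinery of playing...
movement = subtract(scales, rest)
# ...while Bach minus scales isolates what the music adds beyond the notes.
musical_meaning = subtract(bach, scales)

print(max(movement, key=movement.get))          # dominated by motor areas
print(max(musical_meaning, key=musical_meaning.get))  # dominated by the right hemisphere
```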

Right-handed musicians do most of their motor programming in their left brain motor areas, but the programming used to play Bach is all right-lateralised. So this "musical" content comes from the non-dominant hemisphere. Emotional qualities are literally being put into the movements of playing.

The complex motor skills involved in playing a musical instrument are interesting in themselves. Such patterned, planned and executed movements form neural pathways - measurable physical bridges in the nervous system. Research has shown that merely imagining these repetitive finger movements develops neural pathways in the same way as actually performing them. Separate research has shown that when people imagine listening to tonal music, the same areas of the brain are brought into play as when they actually hear it. Intriguingly, when imagining listening to music, an area of the visual cortex also comes into play.

For most cultures, music, science and healing were merely different aspects of the same art. Now, modern medicine is beginning to embrace a broader view of mind and body, and science is helping to rediscover the true potency of music. In Germany, Ralph Spintge has brought music and medicine together. While most of us would accept that music might ease emotional pain, he is using it in a clinical setting with remarkable results. Dr Spintge heads a pain clinic and has now established a database on the effects of music with 90,000 patients. In between treatments, or when waiting, they can choose music which they think helps them; this is proving helpful and soothing to patients in an intimidating hospital environment. It also improves their quality and speed of recovery.

Musical pieces have also been specially composed to induce the optimum conditions, mentally and physically, for specific medical procedures. In painful operations, for example, 15 minutes of soothing music lulls the patient into a sense of well-being so that only 50 per cent of the recommended doses of sedatives and anaesthetics are needed. Indeed, some procedures are now undertaken without any anaesthetic at all, something previously unthinkable.

Dr Spintge believes the rhythmic components of the music are the most effective in his work. The pieces specially composed to create specific physiological change in his patients lock into the innate neurophysiological and biological rhythms that underlie the vital functions of the body. Spintge agrees that part of the value of the music is that it distracts the mind and allows the patient to "escape" into some favourite situation. However, the potency of music to change the physiological state goes beyond distraction.

These examples, sampled from a much larger body of research and clinical practice, demonstrate that music does seem to be, as its Sanskrit name "sangita" suggests, at once a true language, an endless game, and an outgrowth of the very roots of our being as moving, time-conscious creatures. It appears to be one of the earliest sciences of healing, and still relevant for this purpose today.

Paul Robertson is leader of the Medici String Quartet and Visiting Professor of Music and Psychiatry at Kingston University. He presents 'Music and the Mind', a three-part series starting tonight, Channel 4, 9pm.
