‘We are all synaesthetes to some extent,’ says Professor Charles Spence of the Department of Experimental Psychology.
Synaesthesia is a neurological condition that often involves a ‘blending of the senses’. It is thought to affect less than 1% of the population, and people experience it in a variety of ways.
Some people may ‘see’ sounds, in that hearing sounds triggers them to see particular colours at the same time, while others might experience colours while reading plain black text. Whatever sensory connections an individual experiences, they are remarkably consistent – a particular tone or word will always trigger precisely the same colour or taste. [For more on the condition, listen to Professor Irene Tracey on Inside Oxford Science]
What Charles Spence and colleague Cesare Parise have shown is that everybody’s brain automatically combines sights and sounds that are likely to be related in a similar way. Their findings are published in the journal PLoS ONE.
The researchers suggest the brain has evolved this ability to merge related information from different senses so it can effectively pick its way through all the sounds, sights and sensations the body is constantly bombarded with.
‘It’s why at a noisy cocktail party you can tell who is speaking with which voice. And it’s why you can picture the size of a dog by hearing its growl – the brain associates a low note or sound with a large object,’ says Professor Spence.
Professor Spence and Cesare Parise sat 12 volunteers in front of a screen and gave them headphones. They presented each volunteer with an image and a tone of a certain frequency at slightly different times, and asked which of the two came second. In a second experiment, they played the sounds at different positions to the left or right of the image, and asked the volunteers which side the sound came from.
All the volunteers considered certain images and sounds to be associated, such as small shapes and high-pitched tones or, conversely, large shapes and low pitches (see video). Similarly, sharp, irregular shapes were connected to high pitches, while more curved shapes went with low pitches.
When the image and sound were related like this, the brain tended to process the information together and the volunteers were much less able to tell which came second, or where the sound came from.
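The pairings described above can be made concrete in a short sketch. This is not the researchers’ code – the function name and stimulus labels are illustrative assumptions – but it captures the congruency rule the article reports: small shapes go with high pitches, large shapes with low pitches.

```python
# Illustrative sketch (not the authors' code): labelling audiovisual pairs
# as congruent or incongruent, following the associations reported above.

def is_congruent(shape_size: str, pitch: str) -> bool:
    """True if the shape/pitch pairing matches the reported association:
    small shapes pair with high pitches, large shapes with low pitches."""
    return (shape_size == "small" and pitch == "high") or \
           (shape_size == "large" and pitch == "low")

# Congruent pairs were the ones volunteers found harder to pull apart:
# they were less able to say which stimulus came second, or where the
# sound came from, because the brain processed the pair together.
for size, pitch in [("small", "high"), ("small", "low"),
                    ("large", "high"), ("large", "low")]:
    label = "congruent" if is_congruent(size, pitch) else "incongruent"
    print(f"{size} shape + {pitch} pitch -> {label}")
```

In an experiment of this kind, trials would be sorted by this congruency label and performance compared between the two groups.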
‘Knowing how the brain connects information from different senses, we should be able to design warning signals that combine lights and sounds to really drive attention and get people to respond swiftly and accurately,’ says Cesare Parise. ‘This could be really important in an aircraft cockpit, for example, where a pilot has to deal with a great deal of information all at once.’
‘Or synaesthetic associations could be exploited to develop more effective sensory substitution devices for people who are deaf or blind.’
Professor Spence gives another example: ‘We are currently working with Heston Blumenthal to come up with appropriate names for some of the dishes at his restaurant The Fat Duck, in Bray. For example, we find that people tend to associate the sound of a word like “bubu” more with soft, creamy textures like brie, and “kiki” with the sharp taste of cranberries, say. So maybe we could come up with appropriate names to fit a soft ice cream that actually has an acidic note when tasted, or a dish with a sharp texture but a creamy taste.’
Image above: Wensleydale with cranberries: ‘bubu’ or ‘kiki’?