Brain rhythms, speech, and hearing

As per the post on my other blog, I wonder if those same brain rhythms, supplemented by voice as in the anthology Let Me Tell You a Story, might enhance the reading experience for people whose reading is interrupted by stumbles over unfamiliar words. The article below is re-published with permission.

In loud rooms our brains ‘hear’ in a different way – new findings

Joachim Gross, University of Glasgow and Hyojin Park, University of Glasgow

When we talk face-to-face, we exchange many more signals than just words. We communicate using our body posture, facial expressions and head and eye movements; but also through the rhythms that are produced when someone is speaking. A good example is the rate at which we produce syllables in continuous speech – about three to seven times a second. In a conversation, a listener tunes in to this rhythm and uses it to predict the timing of the syllables that the speaker will use next. This makes it easier for them to follow what is being said.

Many other things are also going on. Using brain-imaging techniques we know for instance that even when no one is talking, the part of our brain responsible for hearing produces rhythmic activity at a similar rate to the syllables in speech. When we listen to someone talking, these brain rhythms align to the syllable structure. As a result, the brain rhythms match and track in frequency and time the incoming acoustic speech signal.


When someone speaks, we know their lip movements help the listener, too. Often these movements precede the speech – opening your mouth, for example – and provide important cues about what the person will say. Even on their own, lip movements contain enough information to allow trained observers to understand speech without hearing any words – which is why some people can lip-read. What has been unclear until now is how these movements are processed in the listener’s brain.

Lip-synching

This was the subject of our latest study. We already knew that it is not just a speaker’s vocal cords that produce a syllable rhythm, but also their lip movements. We wanted to see whether listeners’ brain waves align to speakers’ lip movements during continuous speech in a way comparable to how they align to the acoustic speech itself – and whether this was important for understanding speech.

Our study has revealed for the first time that this is indeed the case. We recorded the brain activity of 44 healthy volunteers while they watched movies of someone telling a story. Just like the auditory part of the brain, we found that the visual part also produces rhythms. These align themselves to the syllable rhythm produced by the speaker’s lips during continuous speech. And when we made the listening conditions more difficult by adding distracting speech, which meant that the storyteller’s lip movements became more important for understanding what was being said, the alignment between the two rhythms became more precise.


In addition, we found that the parts of the listener’s brain that control lip movements also produce brain waves that are aligned to the lip movements of the speaker. And when these waves are better aligned to the waves from the motor part of the speaker’s brain, the listener understands the speech better. This supports the idea that brain areas used for producing speech are also important for understanding speech, and could have implications for studying lip-reading among people with hearing difficulties. Having shown this in relation to a speaker and listener, the next step will be to look at whether the same thing happens with brain rhythms during a two-way conversation.

Why are these insights interesting? If it is correct that speech normally works by establishing a channel for communication through aligning brain rhythms to speech rhythms – similar to tuning a radio to a certain frequency to listen to a certain station – our results suggest that there are other complementary channels that can take over when necessary. Not only can we tune ourselves to the rhythms from someone’s vocal cords, we can tune into the equivalent rhythms from their lip movements. Instead of doing this with the auditory part of our brain, we do it through the parts associated with seeing and movement.

Nor do you need to be a trained lip-reader to benefit – this is why, even in a noisy environment such as a pub or a party, most people can still communicate with each other.


Joachim Gross, Professor in Psychology, University of Glasgow and Hyojin Park, Research Associate, University of Glasgow

This article was originally published on The Conversation. Read the original article.


Put off reading by tricky words?

Put off reading by tricky words: words you probably know when you hear them but can’t ‘hear’ them when you see them?

Can’t find the beat in a poem unless Snoop Dogg’s run a bass line under it?

Can’t find your glasses so …


What if you could have someone read it to you – and not the creepy bloke who always sits next to you on the bus even when there are lots of empty seats? Someone who knows how the poem or the story or whatever should read? The author, maybe?

Well, you could, and a new anthology shows you how. Slap a QR code under the headline or the title, link it to a sound file, and hey presto – the author is in your ears at the same time as the words are under your eyes.
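For the technically curious, here’s how little ‘techy magic’ the production side actually needs: a minimal sketch in Python using the open-source qrcode library. The URL and filename are placeholders invented for illustration, not the anthology’s actual hosting details.

```python
# A minimal sketch of the print/QR/audio idea, assuming the author's
# recording is already hosted online. Uses the open-source "qrcode"
# library (pip install qrcode[pil]). The URL and filename below are
# hypothetical placeholders, not the anthology's real setup.
import qrcode

# Address of the hosted recording (placeholder).
audio_url = "https://example.com/audio/my-poem.mp3"

# High error correction helps the code survive a folded or creased page.
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
qr.add_data(audio_url)
qr.make(fit=True)

# Save a black-and-white image ready to print under the title.
qr.make_image(fill_color="black", back_color="white").save("my-poem-qr.png")
```

Any free scanner app, or a modern phone camera, then opens the link, and the reader hears the words while following them on the page.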

I won’t be coy; I’m the editor. I am being a bit coy, though, when I say I think this is a first, because if I weren’t I’d be foghorning that I think it’s the first time ever, anywhere in the world, that a book or a leaflet or any other piece of print has ever used a QR code ever to help a reader make more sense of the words by letting them hear it read. [Did I mention ever?] It’s discreet, it’s private: with your ear buds in, no one knows you’re getting assistance. And if you have a really big problem reading, so big that someone always has to read even the most embarrassing stuff to you, wouldn’t this be handy on doctors’ leaflets?

I think the book shows you how, with a bit of techy magic, we could all be more independent, bigger readers, beat poets in our own heads.

#QR4PR – a QR code for a Personal Reader? Tell your doctor, your dentist, pharmacist, physio, the folk who write the instructions for DIY tools or self-assembly wardrobes; tell them you want one.

Personal Readers – a new application for QR codes?

For most of my clinical career I worked with people with intellectual disabilities, and if you spend any time at all with people who find reading difficult, you soon realise how much literacy matters. Assessing decisional capacity, I listened as people accused of breaching contracts or tenancy agreements stumbled through them word by word, and sometimes syllable by syllable, arriving finally at the end with no idea of what it all meant. I also saw the blank faces at meetings where, despite supplementary documents being produced using EasyRead to maximise inclusivity, understanding was limited. People were distracted by the images used to support the words, and the words never gave as much information as the documents everyone else had.

But I knew from talking to many of the same people how adept they could be at expressing themselves. Tone of voice said more than the word itself, and the spaces between their words were only about assembling the information they needed, not about ignorance of language per se. Often, they knew the words when they heard them, they just couldn’t read them.

Much the same went for fiction. Not so important, you might think, but if there’s nothing to read that you can identify with and that doesn’t dumb you down, why would you bother trying? Where were the genre stories for adults with intellectual disabilities? The crime, the romance, the sci-fi, fantasy, horror? I searched and came up blank, yet almost everyone I knew followed the soaps. I could get people to role-play if they could be characters from EastEnders, and I needed to be on the ball myself or I could easily be duped. One woman with Down’s did just that: flaked out across the chairs in my office, she looked the picture of imminent collapse. The GP advised an ambulance and paramedics were scrambled. They were there in minutes, took her pulse, put her into their cardiac ambulance no less, and attached electrodes. She checked out fit as a flea, so they called her carers and off she went home. That night, I saw a clip of a previous EastEnders episode showing that exact scenario.

People clearly want, consume, and enjoy fiction. Many are actors and some, like Sarah Gordy and Tommy Jessop, achieve prominence in prime time TV shows. Others write copious accounts of their lives via volunteers, and some have had their poems and stories transcribed and put into book form. But then that’s it – if they can’t read what’s been written on their behalf it becomes ephemeral and it’s lost to them.

Technology hasn’t been hugely helpful: despite developments that showed early promise, most have so far ended up too complicated for everyday use. Screen readers are improving, though, and the NHS increasingly uses videos to show procedures patients might need to undergo. But people are easily bored when an activity is passive, and even a short film with no interactivity, especially if the person doesn’t feel involved in it, can lead to rapid switch-off.

We tried a different approach to preparing people for a hospital admission. We took volunteers with intellectual disabilities into a virtual world and let them navigate rooms based on their local hospital. They loved it. They wanted to go back. They liked the seafront and wanted to see if their own houses were there, or their favourite shops.

But the programme, Second Life, was a step too far for an organisation such as the NHS. It had to be downloaded onto computers and it seemed to have no rules. It could also be buggy, laggy, and unstable, and it required someone on hand who knew how to nurse it better and make it work. Not an ideal use of people’s time.

So that’s on hold (never say never) but what was clear from that experience was that our participants, some with quite severe intellectual limitations, ‘got it’. They understood the semi-fictional nature of the environment, they engaged with it to the extent that one managed to get himself out of the hospital build altogether and onto what seemed to be a space station! They made up stories about things they saw when memory failed them, and they told other stories about their own hospital experiences triggered by seeing the operating theatre or the waiting room. We should keep this in mind; virtual reality is coming on in leaps and bounds and its potential for clinical, social and many other applications is phenomenal.

For now though, and back to the reading problem, we have a new opportunity, something so simple it seems ridiculous – pop a QR code onto a leaflet, the top of a short story, someone’s poem, and link it to an audio file of the words being spoken[1].

I’ve argued elsewhere for this in the interests of privacy. Too many times I saw vulnerable adults having sensitive information read to them by someone else. Usually it was a benign individual – a parent or sibling – but occasionally it wasn’t: it was a young child, a new carer, or a person I strongly suspected was an abuser who might use the information to further belittle or exploit them.

This doesn’t need to happen if people can have such material read to them privately any time they want via one of the many, almost ubiquitous, smart devices now available. These are already being used to help people travel independently, to get texts about their bus service and check the route as they go, or to alert specific people if they feel unsafe. A free scanner app to hear someone read an information leaflet about your cervical smear test or testicular cancer examination is barely any more complicated.

My personal campaign, #QR4PR (a QR code for a Personal Reader), is a first step towards getting QRs on as many NHS and other public information leaflets as possible. If you’ve ever sat in the doctor’s waiting room with a leaflet you’ve been given to read before you go in, having forgotten your glasses, you’ll know how it feels to be disconnected from what you need to know. A quick scan and you’d be back in touch, with no need for an embarrassing scout around for someone willing to read it for you.

The final bonus, beyond privacy, dignity, inclusivity, and accessibility, is readability. One thing writers of fiction know full well is that reading your work aloud leads to better writing on the page. So apart from anything else, if the professionals who write these things have to record them too, they’ll certainly write something that is easier to read!


[1] What I believe to be the first use of this print/QR/audio combination is demonstrated in the anthology, Let Me Tell You a Story.