Communication
Can sign language tools help improve care for Deaf patients?
A research project in Amsterdam is developing a machine translation tool to improve communication between healthcare professionals and Deaf patients. Abi Millar finds out more.
The Covid-19 pandemic has been a uniquely challenging time for those who are deaf or hard of hearing. Most obviously, face masks render lip-reading impossible. They also muffle speech and remove other cues like facial expressions – a real issue for people with minor hearing loss who would previously have been able to get by.
On top of that, many deaf people have struggled with the shift to video meetings, few of which accommodate their needs. Some have skipped testing or vaccination appointments, which often must be booked by telephone only. Others have grappled with a lack of accurate information, such as when the UK Government failed to provide sign language interpreters during its daily coronavirus briefings.
The worst experiences, however, have been reserved for hospitalised patients who rely on sign language. At the height of the pandemic, patients were not allowed access to an interpreter – meaning they had very limited means of communicating with healthcare staff.
“It’s important in medical settings, where you want to avoid any miscommunication, that you communicate with people in their mother tongue,” says Lyke Esselink, a researcher at SignLab Amsterdam. “Written language is not Deaf people’s mother tongue.”
While some deaf people have benefited from written notes – along with apps like Google’s Live Transcribe – that hasn’t always been the case. Those who were born deaf, or lost their hearing early in life, tend to use sign language as their first language (the convention is to capitalise ‘Deaf’ to describe this community). An estimated seven in 10,000 people are pre-lingually Deaf, and many struggle to learn to read.
“Languages are largely phonetic, and we learn to recognise written language based on phonemes as well,” says Esselink. “It would be like having to learn Thai, for example, without knowing what the alphabet sounds like.”
Without access to an interpreter, non-literate Deaf patients may struggle to get all the information they need – and the challenges have been exacerbated during Covid-19.
An animated interpreter
SignLab Amsterdam, a cross-faculty research lab at the University of Amsterdam, is working on a possible solution to this problem: a machine translation tool that converts written text into sign language. The technology is not designed to replace human interpreters, but it could play a vital role in situations where interpreters aren’t available.
“If the doctor is sitting behind the desk and has their laptop, or maybe a phone or a tablet, they can type a question or a statement that they want to show to the patient,” says Esselink. “They press play, and then on the screen is an animation relaying that question or statement.”
The technology features an animated avatar that can sign basic phrases. For instance, the doctor could ask if the patient is experiencing certain symptoms, or the nurse could tell the patient that the doctor will be with them soon.
Although more complex communication is not yet possible – informed consent, for instance, is some way down the line – the tool represents a technological leap forward.
“There are some applications already available in which people translate text into sign language, but the sentences need to be recorded in their entirety,” says Esselink. “Even in a closed domain, like a hospital, there are endless variations on sentences, and recording them one by one is really not feasible. So, what we're trying to do is dynamically generate the sentences. This is largely unexplored territory.”
SignLab Amsterdam’s machine translation tool converts written text into sign language that is displayed by an avatar.
The challenges of virtual translation
To understand why this is so difficult, it’s important to know that sign languages function very differently from spoken languages. You cannot simply translate a sentence word-for-word into a string of consecutive signs.
“Sign languages are all different languages with their own rules, and the signs adapt based on the sentence,” says Esselink. “A large part of sign languages is the non-manual aspects. What do the eyebrows do? What facial expression do you have? Are you mouthing anything? What's your posture? These things can affect the meaning of a sign and of a sentence.”
SignLab’s software works by taking the text and converting it into a ‘gloss’ – a list of the appropriate signs in textual form. For instance, if the text asked, “Have you finished eating?”, the gloss would be “you, eating, done”, plus an eyebrow raise to indicate this is a question.
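The article does not publish SignLab’s implementation, but the text-to-gloss step it describes can be sketched with a toy rule-based lookup. Everything below is illustrative: the function name, the lexicon, and the gloss labels are invented for this sketch, using only the article’s “Have you finished eating?” example.

```python
# Illustrative sketch only - SignLab's actual system is not public.
# A gloss is an ordered list of sign labels plus non-manual markers
# (e.g. a raised eyebrow marking a yes/no question).

def text_to_gloss(sentence: str) -> dict:
    """Map a sentence to a gloss using a tiny hypothetical lexicon."""
    rules = {
        # Signs are reordered to follow sign-language word order,
        # not English word order.
        "have you finished eating?": {
            "signs": ["YOU", "EAT", "DONE"],
            "non_manual": ["eyebrows-raised"],  # marks a question
        },
    }
    # Unknown sentences return an empty gloss in this toy version;
    # a real system would generate glosses dynamically.
    return rules.get(sentence.lower().strip(),
                     {"signs": [], "non_manual": []})

gloss = text_to_gloss("Have you finished eating?")
print(gloss["signs"])       # ['YOU', 'EAT', 'DONE']
print(gloss["non_manual"])  # ['eyebrows-raised']
```

A lookup table like this is exactly what Esselink says does not scale – hence SignLab’s move to generating sentences dynamically rather than enumerating them.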
“From those glosses, we move on to code, and that code tells the animation software what the avatar should do,” says Esselink. “So, what are they doing with their eyebrows? What movements are they making?”
She elaborates that, much like spoken languages, sign languages have phonemes. However, while the phonemes of a spoken language are units of sound, a sign is made up of a combination of hand shape, location, orientation, movement and non-manual features – each a phoneme in its own right. SignLab’s code takes this phonetic representation of a sign and turns it into an animation.
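The final step – turning a sign’s phonological parameters into avatar instructions – might be structured along these lines. Again, this is a sketch, not SignLab’s code: the class, field names and keyframe format are invented to illustrate the idea of mapping hand shape, location, orientation, movement and non-manual features to animation channels.

```python
from dataclasses import dataclass

# Hypothetical encoding of one sign's phonological parameters.
@dataclass
class SignPhonology:
    handshape: str    # e.g. "flat-hand"
    location: str     # e.g. "chin"
    orientation: str  # e.g. "palm-in"
    movement: str     # e.g. "outward-arc"
    non_manual: str   # e.g. "eyebrows-raised"

def to_animation_keyframes(sign: SignPhonology) -> list:
    """Turn one sign's parameters into per-channel avatar keyframes."""
    return [
        {"channel": "hand", "pose": sign.handshape, "at": sign.location},
        {"channel": "arm", "motion": sign.movement,
         "orient": sign.orientation},
        {"channel": "face", "expression": sign.non_manual},
    ]

frames = to_animation_keyframes(
    SignPhonology("flat-hand", "chin", "palm-in", "outward-arc", "neutral")
)
print(len(frames))  # 3
```

Separating the phonological description from the animation channels is what lets sentences be generated dynamically: new signs only need new parameter values, not new hand-animated recordings.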
The path to the clinic
Ingenious though it is, SignLab’s tool is not yet at the stage where it can be used in real-life clinical settings.
“We definitely need to research this more before we would feel comfortable deploying it,” says Esselink. “Obviously, if you deploy it in a medical setting, then the Deaf people would want the information to be really clear – there can't be any communication issues remaining. We're working really hard on improving this further and getting it there.”
Although conceding that the research could take years, Esselink is hopeful about its potential further down the line. Over the medium term, the goal is to improve access to information – for instance, enabling Deaf people to access certain websites in their mother tongue, or translating information at the train station. Over the longer term, the applications could be extensive.
“We want to make a tool that can translate any text to sign language and might also be used in actual communication between people,” says Esselink. “So, if you encounter a Deaf person, and they ask you a question, you can communicate back with them. Eventually, it could also be used for people who are learning sign language and want to know how to translate a certain sentence.”
SignLab’s tool won’t put interpreters out of a job any time soon. Deaf people would be unlikely to want this, even if the technology were there. However, in situations where interpreters aren’t around, or aren’t permitted, the tool could help remove communication barriers – and could dramatically step up the quality of clinical care.