How the brain processes sign language

The ability to speak is one of the essential characteristics that distinguishes humans from other animals. Many people would intuitively equate speech and language. However, cognitive science research on sign languages since the 1960s paints a different picture: it is now clear that sign languages are fully autonomous languages, with a complex organization on several linguistic levels, such as grammar and meaning. Previous studies on how the human brain processes sign language had already identified similarities as well as differences between sign languages and spoken languages. Until now, however, it has been difficult to derive a consistent picture of how both forms of language are processed in the brain.

Researchers at the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) now wanted to know which brain regions are actually involved in the processing of sign language across different studies, and how large the overlap is with the brain regions that hearing people use for spoken language processing. In a meta-study recently published in the journal Human Brain Mapping, they pooled data from sign language processing experiments conducted around the world. “A meta-study gives us the opportunity to get an overall picture of the neural basis of sign language. So, for the first time, we were able to statistically and robustly identify the brain regions that were involved in sign language processing across all studies,” explains Emiliano Zaccarella, last author of the paper and group leader in the Department of Neuropsychology at the MPI CBS.

Read more: Medical Xpress

Sign languages are fully-fledged, natural languages with their own dialects – they need protecting

We most often think of indigenous languages in the context of colonisation – languages used by people who originally inhabited regions that were later colonised. These are the languages the UN had in mind when it expressed deep concern about the vast number of endangered indigenous languages. And rightly so. More than 2,400 of the roughly 7,000 languages used around the world today are endangered, and most of these are indigenous languages in the sense above.

It’s welcome, then, that 2019 marks the International Year of Indigenous Languages, along with the awareness-raising this will bring, as indigenous communities who speak these languages are often marginalised and disadvantaged. But other communities who use indigenous languages may still receive little attention: deaf communities around the world who use sign languages.

Linguistic diversity

Sign languages are fully-fledged, complex, natural languages, with their own grammar, vocabulary, and dialects. There are over 140 recorded living sign languages in the world today.

These sign languages have evolved naturally, just like spoken languages. There is no “universal” sign language understood by all deaf communities around the world. For example, British Sign Language and American Sign Language are completely unrelated languages; users of the two cannot understand each other without the help of an interpreter.

Overall, indigenous peoples and their languages drive much of the world’s cultural and linguistic diversity, and sign languages make up only a small portion of this. But the particular diversity that sign languages exhibit contributes tremendously to our understanding of what language is.

Sign languages are acquired and processed in the brain just like spoken languages and fulfil all the same communicative functions. Yet they do so through vastly different means. Sign languages and tactile sign languages have taught us that our capacity for language is independent of any medium.

Any part of our upper body can be involved in language production and can carry grammar, as in American Sign Language, where facial expressions have grammatical functions. We can understand languages not just by hearing, but also through sight and touch. This realisation has contributed greatly to our understanding of the capacity for language in humans.

Read more: The Conversation

Glove turns sign language into text for real-time translation

Handwriting will never be the same again. A new glove developed at the University of California, San Diego, can convert the 26 letters of American Sign Language (ASL) into text on a smartphone or computer screen. Because it’s cheaper and more portable than other automatic sign language translators on the market, it could be a game changer. People in the deaf community could communicate far more easily with those who don’t understand their language. It may also one day fine-tune our control of robots.
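The excerpt doesn’t spell out how the glove decodes handshapes, but coverage of the UCSD device described a notably simple scheme: nine knuckle sensors, each read as straight or bent, yielding a nine-bit binary key that is matched against a table of letter codes. Here is a minimal Python sketch of that idea; the thresholds, the table entries and all names are illustrative assumptions, not values from the published glove.

```python
# Minimal sketch of a binary-key decoder of the kind described for the
# UCSD glove: nine knuckle sensors, each read as straight (0) or bent (1),
# with the resulting nine-bit key matched against a table of letter codes.
# Thresholds, table entries and all names are illustrative assumptions,
# not values from the published device.

from typing import Optional, Sequence

# Hypothetical normalised bend threshold for each of the nine sensors.
THRESHOLDS = [0.5] * 9

# Hypothetical fragment of the key-to-letter table; the real glove
# assigns one unique nine-bit code to each of the 26 ASL letters.
KEY_TO_LETTER = {
    "011111111": "A",
    "100001111": "B",
}

def decode(readings: Sequence[float]) -> Optional[str]:
    """Binarise nine sensor readings and look up the matching letter."""
    key = "".join("1" if r > t else "0" for r, t in zip(readings, THRESHOLDS))
    return KEY_TO_LETTER.get(key)  # None when no letter matches the handshape

# Example: thumb straight, all other knuckles bent -> "A" in this toy table.
print(decode([0.1, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9, 0.9]))
```

The appeal of a lookup scheme like this is that classification reduces to a dictionary access rather than heavy signal processing, which is part of why such a device can be cheaper and more portable than camera-based translators.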

ASL is a language all its own, but few people outside the deaf community know it. For many deaf people, signing is their only language, as learning written English, for example, can be difficult without the corresponding sounds to go with it.

“For thousands of people in the UK, sign language is their first language,” says Jesal Vishnuram, the technology research manager at the charity Action on Hearing Loss. “Many have little or no written English. Technology like this will completely change their lives.”

When deaf people need to communicate with people who are not versed in ASL, their options are limited. In the UK, someone who is deaf is entitled to a sign language translator at work or when visiting a hospital, but at a train station, for example, it can be incredibly difficult to communicate with people who don’t sign. In such situations, a glove that can translate for them would make life much easier.

Read more: New Scientist

Understanding the amazing complexity of sign language

Most people are familiar with sign language, the system that deaf people use to communicate. What fewer may know is that there are many different sign languages around the world, just as there are many different spoken languages.

So how does the grammar of sign language work?

Unlike spoken languages, in which grammar is expressed through sound-based signifiers for tense, aspect, mood and syntax (the way we organise individual words), sign languages use hand movements and sign order, as well as body and facial cues, to create grammar. These body and facial cues are known as non-manual activity.

To find out whether these cues are comprehensible to signers and non-signers of a country, my team of deaf and hearing linguists and translators conducted two studies. The results, which will be published in July, demonstrate the incredible complexity of sign language.

Read more: The Conversation

How Movie Magic Could Help Translate for Deaf Students

The technology behind the cooking rats in Ratatouille and the dancing penguins in Happy Feet could help bridge stubborn academic gaps between deaf and hearing students. Researchers are using computer-animation techniques, such as motion capture, to make lifelike computer avatars that can reliably and naturally translate written and spoken words into sign language, whether it’s American Sign Language or that of another country.

English and ASL are fundamentally different languages, said computer scientist Matthew Huenerfauth, director of the Linguistic and Assistive Technologies Laboratory at the Rochester Institute of Technology, and translation between them “is just as hard as translating English to Chinese.” Programming avatars to perform that translation is much harder. Not only is ASL grammar different from English, but sign language also depends heavily on facial expressions, gaze changes, body positions, and interactions with the physical space around the signer to make and modify meaning. It’s translation in three dimensions.

About three-quarters of deaf and hard-of-hearing students in America are mainstreamed, learning alongside hearing students in schools and classes where sign-language interpreters are often in short supply. On average, deaf students graduate high school reading English—a second language to them—at a fourth-grade level, according to a report out of Gallaudet University, the premier university for deaf students. That reading deficit slows their learning in every other subject. It also limits the usefulness of closed captioning for multimedia course material.

Read more: Slate

The people behind India’s first sign language dictionary

Apart from a few clicks on computer keyboards and the routine shuffling of paper, it is very quiet at this office in the capital, Delhi. A group of 15 people here are working on the massive task of compiling more than 7,000 signs that deal with words used in academic, medical, legal, technical and routine conversations by deaf people in India.

The group is a mix of speech- and hearing-impaired, deaf and non-disabled people. However, they all communicate with each other in sign language.

Sign language has evolved in India over the last 100 years, but the government has only now decided to codify it in the form of a dictionary.

Once completed, it will translate Indian Sign Language into English and Hindi, and will be available in print and online editions.

Read more: BBC

Unusual Condition Lets People See Sign Language in Colors

People who use sign language can experience synesthesia, a rare condition that mixes sensory information from different sources, causing people to see letters in certain colors or to taste words, a new study finds.

The study is the first to document synesthesia among sign language users, the researchers said. The people in the study reported that they saw different colors when they watched someone make the signs for various letters and numbers.

The finding shows that “synesthesia occurs in sign language as well as spoken language,” said lead researcher Joanna Atkinson of the Deafness, Cognition and Language Research Centre at University College London. The condition occurs in about 4 percent of the population, including people who experience it with braille letters or while reading sheet music, the researchers said.

Synesthesia is thought to happen because of the wiring in the brain, the researchers said. For instance, this wiring may lead to “cross-activation in neural areas that do not usually interact.”

Read more: Live Science

Endangered Language: How Technology May Replace Braille and Sign

It’s hard to think about language as being endangered or replaceable. But as our culture and means of communication evolve, certain languages find their utility in decline. Braille and sign language are in just such a predicament.

Technological advancements such as voice-to-text, digital audio, and the cochlear implant have steadily decreased the demand for these once-revolutionary facilitators for the disabled. This hour, we’ll hear from members of the hearing- and visually-impaired communities about this controversial shift in their culture.

Read more: WNPR

These Students Created Software That Could Help Millions Of Deaf People In Brazil

SÃO PAULO — Approximately 5 percent of Brazilians have some kind of hearing impairment. Despite the widespread implementation of Brazilian Sign Language, also called Libras, in 2005, people with hearing impairments often need the help of translators to read texts, textbooks, websites and other reading material in standard Portuguese, which uses a different grammatical structure.

Hoping to improve the daily lives of people with hearing loss, three computer engineering students at Faculdade Independente do Nordeste (Northeast Independent School) in the Brazilian state of Bahia have created Librol, a software program that translates text from Portuguese to Libras.

Raira Carvalho, 19, André Ivo, 19, and Jennifer Brito, 21, believe that Librol will grant greater autonomy to Brazilians with hearing impairments. It offers a way to read and study without the presence of a human translator.

Read more: Huffington Post

More people want to learn sign language than French or German

More people want to learn sign language than French or German, a study shows today.

And a survey by the National Deaf Children’s Society shows two out of three adults think sign language is more impressive than speaking a foreign language.

One in four people in Britain say they want to learn sign language, which would total 12.7m adults.
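As a quick sanity check on that figure (treating “one in four” as 25 per cent, and noting the excerpt never states the population base):

$$\frac{12.7\ \text{million}}{0.25} = 50.8\ \text{million},$$

which is consistent with the roughly 51 million adults living in Great Britain.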

The top three languages people would like to learn are Spanish (28%), British Sign Language (24%) and French (23%).

Read more: Mirror

This Smart Glove Translates Sign Language to Text and Speech

Hadeel Ayoub, a Saudi designer and media artist, wants to bridge the communication gap between people with and without hearing disabilities. After a year of tinkering, she’s come up with a “smart glove” that converts sign language into text and speech.

“I have an autistic niece who is four and who doesn’t speak. When I saw her communicating with sign language, I wondered what would happen if she tried to communicate with someone who doesn’t speak the same language,” Ayoub told me over the phone. “Does that mean she wouldn’t be able to get through to them?”

Ayoub told me that her motto is always “to design for a cause,” and to produce products that will help a wider community. With the “smart glove,” her aim is to “facilitate communication” between people, and break down language barriers in the process.

Read more: Motherboard

The Life and Death of Martha’s Vineyard Sign Language

From its founding in 1640 through the end of the 1800s, people who were born in Chilmark, a small town on the western end of Martha’s Vineyard, also tended to die in Chilmark.

Two of those people were the children of Jonathan Lambert, a man who had come to Chilmark from Kent, England, in the late 1600s. According to island records, Lambert was deaf; his children, born after his arrival, were the first congenitally deaf residents of Martha’s Vineyard. They were also the beginning of a language and deaf culture unique to the island—one that used to thrive, but is now extinct.

Read more: The Atlantic