These gloves let you ‘hear’ sign language

July 2nd, 2020. An estimated half a million Americans with hearing impairments use American Sign Language (ASL) every day. But ASL has one shortcoming: while it allows people who are deaf to communicate with one another, it doesn't enable dialogue between the hearing and the nonhearing. Researchers at UCLA have developed a promising solution: a translation glove. The glove, which slips onto your hand like any other glove, features stretchy, polyester-wrapped wires that can track the positions of your fingers. Within a second, an onboard processor can translate those finger movements into one of more than 600 signs with a remarkable 98.63% accuracy. The results are beamed via Bluetooth to a companion app on your phone, which reads the words aloud. Jun Chen, an assistant professor at UCLA's Department of Engineering who led the research, tells us he was inspired to create the glove after being frustrated while trying to talk to a friend with a hearing impairment. He looked at other solutions that had been proposed for translating ASL and found them imperfect. Vision-recognition systems need the right lighting to make fingers legible. Another proposed solution, which tracks electrical impulses through the skin to read signs, requires precise placement of its sensors to get proper measurements. Read more: Fast Company
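The article describes the pipeline only at a high level: per-finger stretch readings, an onboard classifier that maps each reading to a known sign, and a Bluetooth link to a phone app that speaks the result. The sketch below illustrates that flow under stated assumptions; the reference patterns, the nearest-pattern classifier and the send_to_phone stub are invented for illustration and are not the UCLA team's implementation.

    # Minimal sketch of a glove-to-speech pipeline, assuming five per-finger
    # stretch readings per gesture. Reference patterns, classifier and the
    # Bluetooth stub are illustrative, not the UCLA design.
    import math

    # Hypothetical calibration data: average readings (0 = straight, 1 = fully bent)
    # for a handful of signs. A real system would learn hundreds of these.
    REFERENCE_SIGNS = {
        "hello":     [0.1, 0.1, 0.1, 0.1, 0.1],
        "thank you": [0.2, 0.1, 0.1, 0.1, 0.9],
        "yes":       [0.9, 0.9, 0.9, 0.9, 0.8],
    }

    def classify(reading):
        """Return the sign whose reference pattern is closest to the reading."""
        def distance(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(REFERENCE_SIGNS, key=lambda sign: distance(reading, REFERENCE_SIGNS[sign]))

    def send_to_phone(word):
        """Stand-in for the Bluetooth link; the companion app would speak the word aloud."""
        print(f"[app] speaking: {word}")

    # Example: one frame of sensor data from the glove.
    send_to_phone(classify([0.85, 0.9, 0.88, 0.92, 0.75]))   # -> "yes"

In practice the hand-written reference patterns would be replaced by a model trained on recordings from many signers, which is where a figure like the reported 98.63% accuracy would come from.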

Sign languages emerge just like verbal languages do: Linguist

September 23rd, 2019. Sign languages emerge through a natural process in deaf communities just as verbal languages do, once a language community is formed, according to a linguist. "Sign languages are perceived as translated versions of verbal languages; on the contrary, they are independent," Zeynep Acan Aydin, a linguist at Hacettepe University, told Anadolu Agency as the world marked International Day of Sign Languages on Monday. "Sign languages have a unique set of universal features that are different from those of all verbal languages," Aydin said, adding that speakers may prefer various dialects and variations depending on their age, educational level, gender and social status. She said the first thing that comes to mind about deafness is the loss of an ability. "Thus, sign language is seen as a substitute language, something inadequate and primitive, whose grammatical features are assumed to be unlike those of verbal languages; pantomime and gestures are perceived as a tool that provides only poor communication. However, hand signs in sign languages are conventional; there is an agreed convention among the people who use them," Aydin stressed. Read more: Yeni Şafak

Sign languages are fully-fledged, natural languages with their own dialects – they need protecting

January 28th, 2019. We most often think of indigenous languages in the context of colonisation – languages used by people who originally inhabited regions that were later colonised. These are the languages that the UN had in mind when it expressed deep concern about the vast number of endangered indigenous languages. And rightly so. More than 2,400 of the roughly 7,000 languages used around the world today are endangered, and most of these are indigenous languages in the above sense. It's welcome, then, that 2019 marks the International Year of Indigenous Languages, along with the awareness raising this will bring, as the indigenous communities who speak these languages are often marginalised and disadvantaged. But there are other communities who speak indigenous languages that may still not receive much attention: deaf communities around the world who use sign languages.

Linguistic diversity

Sign languages are fully-fledged, complex, natural languages, with their own grammar, vocabulary, and dialects. There are over 140 recorded living sign languages in the world today. These sign languages have evolved naturally, just like spoken languages. There is no "universal" sign language that is understood by all deaf communities around the world. For example, British Sign Language and American Sign Language are completely unrelated languages; speakers of these two languages cannot understand each other without the help of an interpreter. Overall, indigenous peoples and their languages drive much of the world's cultural and linguistic diversity, and sign languages make up only a small portion of this. But the particular diversity that sign languages exhibit contributes tremendously to our understanding of what language is. Sign languages are acquired and processed in the brain just like spoken languages and fulfil all the same communicative functions. Yet they do so through vastly different means. Sign languages and tactile sign languages have taught us that our capacity for language is independent of any medium. Any part of our upper body can be involved in language production and can carry grammar, as in American Sign Language, where facial expressions have grammatical functions. We can understand languages not just by hearing, but also through sight and touch. This realisation has contributed greatly to our understanding of the capacity for language in humans. Read more:

Glove turns sign language into text for real-time translation

July 13th, 2017. Handwriting will never be the same again. A new glove developed at the University of California, San Diego, can convert the 26 letters of American Sign Language (ASL) into text on a smartphone or computer screen. Because it's cheaper and more portable than other automatic sign language translators on the market, it could be a game changer, making it far easier for people in the deaf community to communicate with those who don't understand their language. It may also one day fine-tune our control of robots. ASL is a language all of its own, but few people outside the deaf community speak it. For many, signing is their only language, as learning written English, for example, can be difficult without the corresponding sounds to go with it. "For thousands of people in the UK, sign language is their first language," says Jesal Vishnuram, the technology research manager at the charity Action on Hearing Loss. "Many have little or no written English. Technology like this will completely change their lives." When they need to communicate with people who are not versed in ASL, their options are limited. In the UK, someone who is deaf is entitled to a sign language translator at work or when visiting a hospital, but at a train station, for example, it can be incredibly difficult to communicate with people who don't sign. In this situation, a glove that can translate for them would make life much easier. Read more: New Scientist
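Recognising a fixed fingerspelling alphabet can be illustrated with a simple lookup: threshold each finger's sensor reading and match the resulting code against a table. The thresholds and codes below are made up for this sketch and are not the encoding the UC San Diego glove actually uses.

    # Illustrative fingerspelling lookup. Each finger's flex reading is
    # thresholded to 0 (straight) or 1 (bent) and the code is looked up.
    THRESHOLD = 0.5

    # Hypothetical codes: (thumb, index, middle, ring, pinky)
    LETTER_CODES = {
        (0, 1, 1, 1, 1): "A",   # fingers curled into a fist, thumb alongside
        (1, 0, 0, 0, 0): "B",   # fingers extended, thumb folded across the palm
    }

    def decode(sensor_values):
        """Threshold raw sensor values and look up the fingerspelled character."""
        code = tuple(1 if v > THRESHOLD else 0 for v in sensor_values)
        return LETTER_CODES.get(code, "?")

    print(decode([0.1, 0.9, 0.8, 0.9, 0.85]))   # -> "A"
    print(decode([0.9, 0.1, 0.2, 0.1, 0.15]))   # -> "B"

Letters whose signs involve movement, such as J and Z, would need more than a single static snapshot, which is one reason real devices track readings over time.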

Understanding the amazing complexity of sign language

June 23rd, 2017. Most people are familiar with sign language, the system that deaf people use to communicate. What fewer may know is that there are many different sign languages around the world, just as there are many different spoken languages. So how does the grammar of sign language work? Unlike spoken languages, in which grammar is expressed through sound-based signifiers for tense, aspect, mood and syntax (the way we organise individual words), sign languages use hand movements, sign order, and body and facial cues to create grammar; the use of body and facial cues is known as non-manual activity. To find out whether these cues are comprehensible to signers and non-signers within a country, my team of deaf and hearing linguists and translators conducted two studies. The results, which will be published in July, demonstrate the incredible complexity of sign language. Read more: The Conversation

How Movie Magic Could Help Translate for Deaf Students

May 17th, 2017. The technology behind the cooking rats in Ratatouille and the dancing penguins in Happy Feet could help bridge stubborn academic gaps between deaf and hearing students. Researchers are using computer-animation techniques, such as motion capture, to make lifelike computer avatars that can reliably and naturally translate written and spoken words into sign language, whether it's American Sign Language or that of another country. English and ASL are fundamentally different languages, said computer scientist Matthew Huenerfauth, director of the Linguistic and Assistive Technologies Laboratory at the Rochester Institute of Technology, and translation between them "is just as hard as translating English to Chinese." Programming avatars to perform that translation is much harder. Not only is ASL grammar different from English grammar, but sign language also depends heavily on facial expressions, gaze changes, body positions, and interactions with the physical space around the signer to make and modify meaning. It's translation in three dimensions. About three-quarters of deaf and hard-of-hearing students in America are mainstreamed, learning alongside hearing students in schools and classes where sign-language interpreters are often in short supply. On average, deaf students graduate high school reading English, a second language to them, at a fourth-grade level, according to a report out of Gallaudet University, the premier university for deaf students. That reading deficit slows their learning in every other subject. It also limits the usefulness of closed captioning for multimedia course material. Read more: Slate
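Because meaning in ASL is carried on several channels at once, a signing avatar has to be driven by more than a sequence of handshapes. The sketch below shows one hypothetical way to represent a sign with both manual and non-manual channels; it is illustrative only and not the data format used by Huenerfauth's lab.

    # Hypothetical multi-channel representation of a sign for a signing avatar.
    from dataclasses import dataclass, field

    @dataclass
    class ManualChannel:
        handshape: str          # e.g. "flat-B", "1", "5"
        location: str           # place of articulation relative to the body
        movement: str           # path and manner of movement

    @dataclass
    class NonManualChannel:
        eyebrows: str = "neutral"   # raised brows can mark yes/no questions
        gaze: str = "forward"       # gaze shifts can reference points in space
        torso: str = "upright"      # body shifts can distinguish referents

    @dataclass
    class SignFrame:
        gloss: str
        manual: ManualChannel
        non_manual: NonManualChannel = field(default_factory=NonManualChannel)

    # A yes/no question in ASL is marked largely on the face, not the hands.
    question = SignFrame(
        gloss="UNDERSTAND",
        manual=ManualChannel(handshape="S-to-1", location="temple", movement="flick open"),
        non_manual=NonManualChannel(eyebrows="raised", gaze="at addressee"),
    )
    print(question)

The point of the structure is that the same manual sign can change meaning when the non-manual fields change, which is what makes avatar translation "three-dimensional."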

The people behind India’s first sign language dictionary

March 3rd, 2017. Apart from a few clicks on computer keyboards and the routine shuffling of paper, it is very quiet at this office in the capital, Delhi. A group of 15 people here are working on the massive task of compiling more than 7,000 signs covering words used in academic, medical, legal, technical and everyday conversations by deaf people in India. The group is a mix of speech- and hearing-impaired, deaf and non-disabled people; however, they all communicate with each other in sign language. Sign language has evolved in India over the last 100 years, but the government has only now decided to codify it in the form of a dictionary. Once completed, the dictionary will translate Indian Sign Language into English and Hindi, and will be available in print and online editions. Read more: BBC

Unusual Condition Lets People See Sign Language in Colors

July 19th, 2016. People who use sign language can experience synesthesia, a rare condition that mixes sensory information from different sources, causing people to see letters in certain colors or to taste words, a new study finds. The study is the first to document synesthesia among sign language users, the researchers said. The people in the study reported that they saw different colors when they watched someone make the signs for various letters and numbers. The finding shows that "synesthesia occurs in sign language as well as spoken language," said study lead researcher Joanna Atkinson, a researcher at the Deafness, Cognition and Language Research Centre at University College London. The condition occurs in about 4 percent of the population, including people who experience it with braille letters or while reading sheet music, the researchers said. Synesthesia is thought to happen because of the brain's wiring, the researchers said; for instance, this wiring may lead to "cross-activation in neural areas that do not usually interact." Read more: Live Science

Endangered Language: How Technology May Replace Braille and Sign

June 20th, 2016. It's hard to think of a language as endangered or replaceable. But as our culture and means of communication evolve, certain languages find their utility in decline. Braille and sign language are in just such a predicament. Technological advancements such as voice-to-text, digital audio and the cochlear implant have steadily decreased the demand for these once-revolutionary facilitators for the disabled. This hour, we'll hear from members of the hearing-impaired and visually impaired communities about this controversial shift in their culture. Read more: WNPR

These Students Created Software That Could Help Millions Of Deaf People In Brazil

February 4th, 2016. SÃO PAULO -- Approximately 5 percent of Brazilians have some kind of hearing impairment. Despite the widespread implementation of Brazilian Sign Language, also called Libras, in 2005, people with hearing impairments often need the help of translators to read texts, textbooks, websites and other reading material in standard Portuguese, which uses a different grammatical structure. Hoping to improve the daily lives of people with hearing loss, three computer engineering students at Faculdade Independente do Nordeste (Northeast Independent School) in the Brazilian state of Bahia have created Librol, a software program that translates text from Portuguese to Libras. Raira Carvalho, 19, André Ivo, 19, and Jennifer Brito, 21, believe that Librol will grant greater autonomy to Brazilians with hearing impairments. It offers a way to read and study without the presence of a human translator. Read more: Huffington Post
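Because Libras and written Portuguese have different grammatical structures, translation is more than word-for-word substitution. The toy sketch below maps Portuguese words to sign glosses and applies a single reordering rule; the lexicon and the rule are invented for illustration and do not reflect how Librol actually works.

    # Illustrative text-to-sign translation via glosses.
    # Hypothetical Portuguese-to-Libras gloss lexicon.
    LEXICON = {
        "eu": "EU",            # I
        "gosto": "GOSTAR",     # like (verbs glossed in citation form)
        "de": None,            # function words with no signed counterpart are dropped
        "estudar": "ESTUDAR",  # to study
        "não": "NÃO",          # not
    }

    def to_glosses(sentence):
        """Map each Portuguese word to a Libras gloss, dropping unglossed words."""
        words = sentence.lower().split()
        glosses = [LEXICON[w] for w in words if LEXICON.get(w)]
        # Toy reordering rule: negation moves to the end of the clause in this sketch.
        if "NÃO" in glosses:
            glosses.remove("NÃO")
            glosses.append("NÃO")
        return glosses

    print(to_glosses("Eu não gosto de estudar"))   # -> ['EU', 'GOSTAR', 'ESTUDAR', 'NÃO']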

More people want to learn sign language than French or German

October 16th, 2015. More people want to learn sign language than French or German, a study shows today. A survey by the National Deaf Children's Society also found that two out of three adults think sign language is more impressive than speaking a foreign language. One in four people in Britain say they want to learn sign language, which would amount to 12.7m adults. The top three languages people would like to learn are Spanish (28%), British Sign Language (24%) and French (23%). Read more: Mirror

This Smart Glove Translates Sign Language to Text and Speech

September 30th, 2015. Hadeel Ayoub, a Saudi designer and media artist, wants to bridge the communication gap between people with and without hearing disabilities. After a year of tinkering, she's come up with a "smart glove" that converts sign language into text and speech. "I have an autistic niece who is four and who doesn't speak. When I saw her communicating with sign language, I wondered what would happen if she tried to communicate with someone who doesn't speak the same language," Ayoub told me over the phone. "Does that mean she wouldn't be able to get through to them?" Ayoub told me that her motto is always "to design for a cause" and to produce products that will help a wider community. With the "smart glove," her aim is to "facilitate communication" between people and break down language barriers in the process. Read more: Motherboard