To Communicate With Apes, We Must Do It On Their Terms

April 26th, 2018

On August 24, 1661, Samuel Pepys, an administrator in England’s navy and famous diarist, took a break from work to go see a “strange creature” that had just arrived on a ship from West Africa. Most likely, it was a chimpanzee—the first Pepys had ever seen. As he wrote in his diary, the “great baboon” was so human-like that he wondered if it were not the offspring of a man and a “she-baboon.” “I do believe that it already understands much English,” he continued, “and I am of the mind it might be taught to speak or make signs.”

Humans and apes share nearly 99% of the same DNA, but language is one thing that seems to irreconcilably differentiate our species. Is that by necessity of nature, though, or simply a question of nurture? “It could be that there’s something biologically different in our genome, something that’s changed since we split from apes, and that’s language,” says Catherine Hobaiter, a primatologist at the University of St. Andrews in Scotland. “But another possibility is that they might have the cognitive capacity for language, but they’re not able to physically express it like we do.”

In the centuries since Pepys’ speculations, scientists and the public alike have only become more enamored with the idea of breaking through the communication barriers separating our species. “It’s every scientist’s dream,” Hobaiter says. “Instead of having to do years of tests, we could just sit down and have a chat.” This would not only allow us to find out what our closest relatives are thinking, but also potentially teach us something about our own evolution and what it means to be human, says Jared Taglialatela, an associate professor at Kennesaw State University and the director of research at the Ape Cognition and Conservation Initiative. “We shared a common ancestor just 5.5 million years ago with both chimpanzees and bonobos,” he says. “That’s useful for making comparisons and answering questions about human origins.”

Scientists have been trying to teach chimps to speak for decades, with efforts ranging from misguided to tantalizingly promising. Now, however, they are coming to realize that we’ve likely been going about it in the wrong way. Rather than force apes to learn our language, we should be learning theirs.

Read more: NOVA Next

The Brain Has Its Own “Autofill” Function for Speech

May 4th, 2017

The world is an unpredictable place. But the brain has evolved a way to cope with the everyday uncertainties it encounters—rather than presenting us with every ambiguity, it resolves them into a realistic model of the world. The body’s central controller predicts every contingency, using its stored database of past experiences, to minimize the element of surprise. Take vision, for example: we rarely see objects in their entirety, but our brains fill in the gaps to make a best guess at what we are seeing—and these predictions are usually an accurate reflection of reality.

The same is true of hearing, and neuroscientists have now identified a predictive-text-like brain mechanism that helps us anticipate what is coming next when we hear someone speaking. The findings, published this week in PLoS Biology, advance our understanding of how the brain processes speech. They also provide clues about how language evolved, and could even lead to new ways of diagnosing a variety of neurological conditions more accurately.

The new study builds on earlier findings that monkeys and human infants can implicitly learn to recognize artificial grammar, or the rules by which sounds in a made-up language are related to one another. Neuroscientist Yukiko Kikuchi of Newcastle University in England and her colleagues played sequences of nonsense speech sounds to macaques and humans. Consistent with the earlier findings, Kikuchi and her team found that both species quickly learned the rules of the artificial grammar. After this initial learning period, the researchers played more sound sequences—some of which violated the fabricated grammatical rules. They used microelectrodes to record responses from hundreds of individual neurons, as well as from large populations of neurons that process sound information. In this way they were able to compare the responses to both types of sequences and determine the similarities between the two species’ reactions.

Read more: Scientific American
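As a rough illustration of the artificial-grammar paradigm described above, here is a minimal sketch in Python. The syllables, transition rules, and sequence lengths are invented for illustration; they are not the grammar or stimuli Kikuchi’s team actually used:

    import random

    # Toy finite-state "artificial grammar": each nonsense syllable may only
    # be followed by certain others. These rules are illustrative assumptions.
    GRAMMAR = {
        "START": ["ba", "di"],
        "ba": ["ku", "di"],
        "di": ["ku"],
        "ku": ["ba", "END"],
    }

    def generate_sequence(max_len=6):
        """Generate one rule-following sequence for the exposure phase."""
        seq, state = [], "START"
        while len(seq) < max_len:
            state = random.choice(GRAMMAR[state])
            if state == "END":
                break
            seq.append(state)
        return seq

    def is_grammatical(seq):
        """Check whether every transition in a test sequence obeys the rules."""
        state = "START"
        for syllable in seq:
            if syllable not in GRAMMAR.get(state, []):
                return False
            state = syllable
        return True

    # Exposure phase: play rule-following sequences...
    exposure = [generate_sequence() for _ in range(5)]
    # ...then test with a sequence that violates the transition rules.
    print(is_grammatical(["ba", "ku", "ku"]))  # False: "ku" may not follow "ku"

In the experiment itself, of course, the measurement of interest was not a true/false label but the neural response recorded when a rule-violating sequence was played.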

AI Systems Are Learning to Communicate With Humans

May 3rd, 2017

In the future, service robots equipped with artificial intelligence (AI) are bound to be a common sight. These bots will help people navigate crowded airports, serve meals, or even schedule meetings. As these AI systems become more integrated into daily life, it is vital to find an efficient way to communicate with them. It is obviously more natural for a human to speak in plain language rather than in strings of code. Further, as the relationship between humans and robots grows, it will be necessary to engage in conversations, rather than just give orders.

This human-robot interaction is what Manuela M. Veloso’s research is all about. Veloso, a professor at Carnegie Mellon University, has focused her research on CoBots, autonomous indoor mobile service robots that transport items, guide visitors to locations in a building, and navigate halls and elevators. The CoBots have been navigating autonomously for several years now and have traveled more than 1,000 km. These accomplishments have enabled the research team to pursue a new direction, focusing now on novel human-robot interaction. “If you really want these autonomous robots to be in the presence of humans and interacting with humans, and being capable of benefiting humans, they need to be able to talk with humans,” Veloso says.

Read more: Futurism

Different Written Languages Are Equally as Efficient at Conveying Meaning

February 12th, 2016

The research, published in the journal Cognition, finds that a reader from, for example, China needs the same amount of time to read and understand a text in Mandarin as a reader from Britain needs for a text in English – assuming both are reading their native language. Simon Liversedge, Professor of Experimental Psychology at Southampton, says: “It has long been argued by some linguists that all languages have common or universal underlying principles, but it has been hard to find robust experimental evidence to support this claim. Our study goes at least part way to addressing this – by showing there is universality in the way we process language during the act of reading. It suggests no one form of written language is more efficient in conveying meaning than another.”

Read more: Neuroscience News

Welcome To The Era Of Big Translation

January 27th, 2016

Despite businesses’ increasing ability to reach foreign customers, the lack of quality translation methods is still the most challenging aspect of global expansion. Currently, translation is an expensive, complicated, and inaccurate process, even using the most advanced software services. The result is that, too often, businesses sacrifice millions of dollars in profit because marketing to global consumers is too complex to be worthwhile.

That’s why we need an era of "Big Translation": leveraging our existing technological tools and scaling up translation capabilities to a level that actually matches global communication needs. Big Translation, a large-scale translation effort by people speaking two or more languages, would allow businesses, individuals, and even tweeters to get what they want translated easily and affordably. The idea banks on the fact that the world certainly isn’t lacking in translation talent. Nearly half the world speaks two or more languages—that’s 3.65 billion people with the potential to contribute to translation. Presently, a number of innovators are trying to tap into the world’s language talent. Chief among them is a groundbreaking idea that bypasses traditional translation tools and moves to a more accessible mobile app platform.

Read more: Fast Company

Understanding speech not just a matter of believing one’s ears

January 27th, 2016

Even if we hear only part of what someone has said, when we are familiar with the context, we automatically add the missing information ourselves. Researchers from the Max Planck Institute for Empirical Aesthetics in Frankfurt and the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig have now succeeded in demonstrating how we do this.

Incomplete utterances are something we constantly encounter in everyday communication. Individual sounds and occasionally entire words fall victim to fast speech rates or imprecise articulation. In poetic language, omissions are used as a stylistic device or are the necessary outcome of the use of regular metre or rhyming syllables. In both cases, our comprehension of the spoken content is only slightly impaired or, in most cases, not affected at all.

The results of previous linguistic research suggest that language is particularly resilient to omissions when the linguistic information can be predicted both in terms of its content and its phonetics. The most probable ending to the sentence “The fisherman was in Norway and caught a sal…” is the word “salmon”. Accordingly, due to its predictability, this sentence-ending word should be able to accommodate the omission of the “m” and “on” sounds.

Read more: Medical Xpress
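The kind of predictability at work in the “sal…” example can be sketched as simple probabilistic completion over a lexicon. A minimal Python sketch, in which the toy words and frequency counts are assumptions for illustration rather than data from the study:

    from collections import Counter

    # Toy lexicon with assumed frequency counts, standing in for a listener's
    # stored experience of which words are likely in a fishing context.
    LEXICON = Counter({"salmon": 50, "salamander": 2, "salad": 1})

    def complete(fragment, lexicon):
        """Return the most frequent word starting with the heard fragment,
        plus its probability among the matching candidates."""
        candidates = {w: n for w, n in lexicon.items() if w.startswith(fragment)}
        if not candidates:
            return None
        total = sum(candidates.values())
        best = max(candidates, key=candidates.get)
        return best, candidates[best] / total

    # "The fisherman was in Norway and caught a sal..."
    print(complete("sal", LEXICON))  # ('salmon', 0.943...)

The higher that probability, the less work the truncated acoustic signal has to do—which is why a highly predictable word can survive losing its final sounds.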

How a joke can help us unlock the mystery of meaning in language

December 17th, 2015

What do you get if you cross a kangaroo with an elephant? You’ll have to wait for the punchline, but you should already have shards of meaning tumbling about your mind. Now, jokes don’t have to be all that funny, of course, but if they are to work at all then they must construct something beyond the simple words deployed.

Language is the tissue that connects us in our daily social lives. We use it to gossip, to get a job, and to give someone the sack. We use it to seduce, quarrel, propose marriage, get divorced and, yes, tell the odd gag. In the absence of telepathy, it lets us interact with our nearest and dearest, and, in our virtual web of digital communication, with hundreds of people we may never have met. But while we now know an awful lot about the detail of the grammatical systems of the world’s 7,000 or so languages, scientific progress on the mysterious elixir of communication – meaning – has been a much tougher nut to crack.

Read more: Phys.org

How Dictation Software is Changing the Way We Communicate

September 28th, 2015

There’s often a noticeable difference between the way we write and the way we speak. Writing allows us to think before we type, and that extra step between a thought and its expression often leads to wording that's more precise—and more formal. But dictation software may be changing that.

According to WIRED, as dictation software on smartphones and computers becomes more sophisticated, more people are using it to draft texts and emails. And although dictation is viewed by many as just a simple time-saver, it may also have an impact on the way we communicate. A small study in 2003 found that dictation software makes writing more casual. For instance, people are more likely to drop titles like “Mr.” and “Ms.” They may also use less complex language in general, in part because they fear dictation software won’t be able to pick it up. Designer Natalie Roth told WIRED, “I simplify what I’m saying so the computer will understand it. It’s the way I speak to someone when I know that their English is a bit rusty.”

Read more: Mental Floss

Do Animals Have Their Own Language And Can They Talk To Each Other?

September 14th, 2015

Can animals understand each other? And if they do, do they communicate through language? Is their language sophisticated enough to combine sounds into meaning in some grammatical sense? And can they talk about things that aren’t right in front of them? A lot of them can do some of those things!

Read more: Gizmodo Australia

Sperm Whales’ Language Reveals Hints of Culture

September 9th, 2015

New ways to grab dinner, the trick to using a tool, and learning the local dialect. These are behaviors that animals pick up from each other. Killer whales, chimpanzees, and birds seem to have a cultural component to their lives. Now a new study suggests that sperm whales should be added to that list.

The ocean around the Galápagos Islands hosts thousands of female sperm whales and their calves that have organized into clans with their own dialects. (Mature males congregate in colder waters near the poles.) How these clans form has been something of a mystery until now. A study published Tuesday in the journal Nature Communications suggests that culture—behaviors shared by group members—keeps these sperm whale clans together. Specifically, these deep-diving whales have a distinct series of clicks, called codas, that they use to communicate during social interactions.

Read more: National Geographic