To the brain, reading computer code is not the same as reading language

December 15th, 2020: In some ways, learning to program a computer is similar to learning a new language. It requires learning new symbols and terms, which must be organized correctly to instruct the computer what to do. The computer code must also be clear enough that other programmers can read and understand it. In spite of those similarities, MIT neuroscientists have found that reading computer code does not activate the regions of the brain that are involved in language processing. Instead, it activates a distributed network called the multiple demand network, which is also recruited for complex cognitive tasks such as solving math problems or crossword puzzles. However, although reading computer code activates the multiple demand network, it appears to rely more on different parts of the network than math or logic problems do, suggesting that coding does not precisely replicate the cognitive demands of mathematics either. “Understanding computer code seems to be its own thing. It’s not the same as language, and it’s not the same as math and logic,” says Anna Ivanova, an MIT graduate student and the lead author of the study. Evelina Fedorenko, the Frederick A. and Carole J. Middleton Career Development Associate Professor of Neuroscience and a member of the McGovern Institute for Brain Research, is the senior author of the paper, which appears today in eLife. Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory and Tufts University were also involved in the study. Read more: MIT

Scientists make a spectacular discovery about the origin of language in the brain

April 21st, 2020: Mammal brains, unlike bones, do not fossilize. This fact presents a conundrum for scientists trying to understand how we came to be how we are. For example, scientists have long argued over how humans became able to speak -- if only they had an ancient brain to settle the debate! But with no time machine, scientists must resort to what they have available, like the brains of animals today that closely resemble our long-extinct ancestors. Modern animal brains can open the door to history, and scientists use them to infer the workings of ancient minds. And so, in a new study, researchers in the United Kingdom have come to posit something rather spectacular about the very old brain: The origin of language in the brain is 20 million years older than we thought. That's quite a sizeable jump from the 5 million-year-old guess of earlier studies. In their research, scientists analyzed open-source brain scans and conducted their own scans on humans, monkeys, and apes. They "identified homologous pathways originating from the auditory cortex," state the authors in their study, which was published Monday in the journal Nature Neuroscience. While speech and language are exclusively human, the presence of this pathway in monkey brains suggests it evolved much earlier than we thought. Scientists had assumed this pathway developed in the human brain 5 million years ago (when humans last shared an ancestor with chimpanzees, our fellow apes). However, new observations across all three species (monkeys, apes, humans) date the emergence of the neural pathway back to when humans last shared an ancestor with the macaque monkey, at least 25 million years ago. Read more: Inverse

Why the language-ready brain is so complex

October 21st, 2019: In a review article published in Science, Peter Hagoort, professor of Cognitive Neuroscience at Radboud University and director of the Max Planck Institute for Psycholinguistics, argues for a new model of language, involving the interaction of multiple brain networks. This model is much more complex than the classical neurobiological model of language, which was largely based on single-word processing. The capacity for language is distinctly human. It allows us to communicate, learn things, create culture, and think better. Because of its complexity, scientists have long struggled to understand the neurobiology of language. In the classical view, there are two major language areas in the left half of our brain. Broca's area (in the frontal lobe) is responsible for the production of language (speaking and writing), while Wernicke's area (in the temporal lobe) supports the comprehension of language (listening and reading). A large fibre tract (the arcuate fasciculus) connects these two 'perisylvian' areas (around the Sylvian fissure, which divides the two lobes). "The classical view is largely wrong," says Hagoort. Language is infinitely more complex than speaking or understanding single words, which is what the classical model was based on. While words are among the elementary 'building blocks' of language, we also need 'operations' to combine words into structured sentences, such as 'the editor of the newspaper loved the article.' To understand and interpret such an utterance, knowing the speech sounds (or letters) and meaning of the individual words is not enough. For instance, we also need information about the context (who is the speaker?), the intonation (is the tone cynical?), and knowledge of the world (what does an editor do?). Read more: Medical Xpress

How Language Hijacked the Brain

June 10th, 2018: I’m sitting in the sun on one of the first mild days of the spring, talking with a modern-day flintknapper about the origins of human language. His name is Neill Bovaird, and he’s neither an archaeologist nor a linguist, just a 38-year-old bearded guy with a smartphone in his pocket who uses Stone Age technology to produce Stone Age tools. Bovaird has been flintknapping for a couple of decades, and as we talk, the gok gok gok of him striking a smaller rock against a larger one punctuates our conversation. Every now and then the gokking stops: A new flake, sharper than a razor blade, breaks off in his palm. I’ve come to see Bovaird, who teaches wilderness-survival skills in western Massachusetts, because I want to better understand the latest theories on the emergence of language—particularly a new body of research arguing that if not for our hominin ancestors’ hard-earned ability to produce complex tools, language as we know it might not have evolved at all. The research is occurring at the cutting-edge intersections of evolutionary biology, experimental archaeology, neuroscience, and linguistics, but much of it is driven by a very old question: Where did language come from? Oren Kolodny, a biologist at Stanford University, puts the question in more scientific terms: “What kind of evolutionary pressures could have given rise to this really weird and surprising phenomenon that is so critical to the essence of being human?” And he has proposed a provocative answer. In a recent paper in the journal Philosophical Transactions of the Royal Society B, Kolodny argues that early humans—while teaching their kin how to make complex tools—hijacked the capacity for language from themselves. Read more: The Atlantic

How does the brain process speech? We now know the answer, and it’s fascinating

June 3rd, 2018: Neuroscientists have long known that speech is processed in the auditory cortex, along with some curious activity within the motor cortex. How the motor cortex is involved, though, has been something of a mystery until now. A new study by two NYU scientists reveals one of the last holdouts to a process of discovery which started over a century and a half ago. In 1861, French neurologist Pierre Paul Broca identified what would come to be known as “Broca’s area.” This is a region in the posterior inferior frontal gyrus. This area is responsible for processing and comprehending speech, as well as producing it. Interestingly, a fellow scientist on whom Broca had to operate emerged from surgery missing Broca’s area entirely, yet was still able to speak. At first he couldn’t form complex sentences, but in time he regained all his speaking abilities. This meant another region had pitched in, and a certain amount of neuroplasticity was involved. In 1874, German neurologist Carl Wernicke discovered another area responsible for processing speech through hearing, this time in the superior posterior temporal lobe. It’s now called Wernicke’s area. The model was updated in 1965 by the eminent behavioral neurologist Norman Geschwind. The updated map of the brain is known as the Wernicke-Geschwind model. Wernicke and Broca gained their knowledge by studying patients with damage to certain parts of the brain. In the 20th century, electrical brain stimulation began to give us an even greater understanding of the brain’s inner workings. Patients undergoing brain surgery in the mid-20th century were given weak electrical brain stimulation. The current allowed surgeons to avoid damaging critically important areas. But it also gave them more insight into what areas controlled what functions. With the advent of fMRI and other scanning technology, we were able to look at the activity in regions of the brain and how language travels across them.
We now know that impulses associated with language travel between Broca’s and Wernicke’s areas. Communication between the two helps us understand grammar, how words sound, and their meaning. Another region, the fusiform gyrus, helps us classify words. Read more: Big Think

Language Utilizes Ancient Brain Circuits That Predate Humans

January 31st, 2018: A new paper by an international team of researchers presents strong evidence that language is learned using two general-purpose brain systems (declarative memory and procedural memory) that are evolutionarily ancient and not language specific. Contrary to popular belief, the researchers found that children learning their native language and adults learning a foreign language do not rely on brain circuitry specifically dedicated to language learning. Instead, language acquisition piggybacks on ancient, general-purpose neurocognitive mechanisms that preexist Homo sapiens. These findings were published online January 29, 2018, in the journal Proceedings of the National Academy of Sciences (PNAS). For this analysis, the research team statistically synthesized the findings of 16 previous studies that examined language learning via declarative and procedural memory, which are two well-studied brain systems. What is the difference between declarative memory and procedural memory? Declarative memory refers to crystallized knowledge that you could learn while sitting in a chair without having to practice finely tuned motor coordination. Declarative memories, such as knowing all 50 states and the District of Columbia or memorizing SAT vocab words, can easily be described on a written test. On the flip side, procedural memory encompasses things like playing a musical instrument or riding a bicycle, which everyone must learn by actually performing the task. Over time, procedural memory becomes automatized in unconscious ways through practice, practice, practice. Over a decade ago, when I created “The Athlete’s Way” program to optimize sports performance, the foundation of my coaching method was to take a dual-pronged approach that targeted declarative (explicit) memory and procedural (implicit) memory separately.
Notably, the discovery that ancient brain circuits are used to learn a language—and are also used to master sports—corroborates that these neurocognitive systems serve multiple purposes. "These brain systems are also found in animals. For example, rats use them when they learn to navigate a maze," co-author Phillip Hamrick of Kent State University in Ohio said in a statement. "Whatever changes these systems might have undergone to support language, the fact that they play an important role in this critical human ability is quite remarkable." Interestingly, results of this analysis showed that memorizing vocabulary words used in a language relied on declarative memory. However, grammar and syntax, which allow us to fluidly combine words into sentences that follow the rules of a language, rely more on procedural memory. When acquiring their native language, children utilize procedural memory to master grammar and syntax without necessarily “knowing” the rules. However, when adults begin to learn a second language, grammatical rules are initially memorized using declarative memory. As would be expected, grammar and syntax switch to procedural memory systems at later stages of language acquisition as someone becomes more fluent. "The findings have broad research, educational, and clinical implications," co-author Jarrad Lum of Deakin University in Melbourne, Australia, said in a statement. Read more: Psychology Today

How you learned a second language influences the way your brain works

March 15th, 2017: Over the past few years, you might have noticed a surfeit of articles covering current research on bilingualism. Some of them suggest that it sharpens the mind, while others are clearly intended to provoke more doubt than confidence, such as Maria Konnikova’s “Is Bilingualism Really an Advantage?” (2015) in The New Yorker. The pendulum swing of the news cycle reflects a real debate in the cognitive science literature, wherein some groups have observed effects of bilingualism on non-linguistic skills, abilities and function, and others have been unable to replicate these findings. Despite all the fuss that has been made about the “bilingual advantage,” most researchers have moved on from the simplistic ‘is there an advantage or not’ debate. Rather than asking whether bilingualism per se confers a cognitive advantage, researchers are now taking a more nuanced approach by exploring the various aspects of bilingualism to better understand their individual effects. To give an idea of the nuances I am talking about, consider this: there is more than one type of bilingualism. A “simultaneous bilingual” learns two languages from birth; an “early sequential bilingual” might speak one language at home but learn to speak the community language at school; and a “late sequential bilingual” might grow up with one language and then move to a country that speaks another. The differences between these three types are not trivial—they often lead to different levels of proficiency and fluency in multiple aspects of language, from pronunciation to reading comprehension. Read more: Quartz

6 Potential Brain Benefits Of Bilingual Education

November 29th, 2016: Brains, brains, brains. One thing we've learned at NPR Ed is that people are fascinated by brain research. And yet it can be hard to point to places where our education system is really making use of the latest neuroscience findings. But there is one happy nexus where research is meeting practice: Bilingual education. "In the last 20 years or so, there's been a virtual explosion of research on bilingualism," says Judith Kroll, a professor at the University of California, Riverside. Again and again, researchers have found, "bilingualism is an experience that shapes our brain for a lifetime," in the words of Gigi Luk, an associate professor at Harvard's Graduate School of Education. At the same time, one of the hottest trends in public schooling is what's often called dual-language or two-way immersion programs. Read more: NPR

“Whistled Languages” Reveal How the Brain Processes Information

November 22nd, 2016: Before electronic communications became a ubiquitous part of people's lives, rural villagers created whistled versions of their native languages to speak from hillside to hillside or even house to house. Herodotus mentioned whistled languages in the fourth book of his work The Histories, but until recently linguists had done little research on the sounds and meanings of this now endangered form of communication. New investigations have discovered the presence of whistled speech all over the globe. About 70 populations worldwide communicate this way, a far greater number than the dozen or so groups that had been previously identified. Linguists have tried to promote interest in these languages—and schools in the Canary Islands now teach the local variant. A whistled language represents both a cultural heritage and a way to study how the brain processes information. Read more: Scientific American

Brain ‘reads’ sentence same way in 2 languages

November 7th, 2016: When the brain “reads” or decodes a sentence in English or Portuguese, its neural activation patterns are the same, researchers report. Published in NeuroImage, the study is the first to show that different languages have similar neural signatures for describing events and scenes. By using a machine-learning algorithm, the research team was able to understand the relationship between sentence meaning and brain activation patterns in English and then recognize sentence meaning based on activation patterns in Portuguese. The findings can be used to improve machine translation, brain decoding across languages, and, potentially, second language instruction. “This tells us that, for the most part, the language we happen to learn to speak does not change the organization of the brain,” says Marcel Just, professor of psychology at Carnegie Mellon University. “Semantic information is represented in the same place in the brain and the same pattern of intensities for everyone. Knowing this means that brain-to-brain or brain-to-computer interfaces can probably be the same for speakers of all languages,” Just says. Read more: Futurity

How language is processed by your brain

August 17th, 2016: If you read a sentence (such as this one) about kicking a ball, neurons related to the motor function of your leg and foot will be activated in your brain. Similarly, if you talk about cooking garlic, neurons associated with smelling will fire up. Since it is almost impossible to do or think about anything without using language -- whether this entails an internal talk-through by your inner voice or following a set of written instructions -- language pervades our brains and our lives like no other skill. For more than a century, it's been established that our capacity to use language is usually located in the left hemisphere of the brain, specifically in two areas: Broca's area (associated with speech production and articulation) and Wernicke's area (associated with comprehension). Damage to either of these, caused by a stroke or other injury, can lead to language and speech problems or aphasia, a loss of language. In the past decade, however, neurologists have discovered it's not that simple: language is not restricted to two areas of the brain or even just to one side, and the brain itself can grow when we learn new languages. Read more: CNN

A ‘lost’ first language can influence how your brain processes sounds

December 8th, 2015: Offering further proof of the impressionable nature of the newborn brain, researchers studying the influence of early language have discovered a life-long impact. In a paper published this week in Nature Communications, scientists from McGill University and the Montreal Neurological Institute explain how even brief exposure to language permanently wires how our brains process sounds and second languages later in life. So even if you can't remember a lick of the French your parents spoke to you as a newborn, that brief exposure to the language makes your brain's handling of a second language different from someone who has spoken or heard only one language. Read more: Mother Nature Network