NYUAD study into sign language and brain function 'shows how similar people are'

Studies involving New York University Abu Dhabi are helping to provide a detailed localisation for certain neural processes involved in language

A laser scanner produces a map of the brain for NYUAD research. Courtesy Jin S Lee

Go back 2,300 years and the ancient Greek philosopher Aristotle was arguing that the purpose of the brain was to help to cool the blood. He thought that consciousness was located in our hearts.

Outlandish as the idea sounds, it was perhaps not so ridiculous a suggestion as it appears, given how rich in blood vessels the brain is.

It has, of course, long since been established that our mental processes take place in the brain and not the heart, and even before Aristotle’s time many took this view.

While our knowledge of the structure of the brain and the way in which various functions are located within it has moved on immeasurably, our most complex organ has still not yielded all of its secrets to science. So researchers remain busy attempting to more precisely localise particular functions.

Studies involving New York University Abu Dhabi are helping to provide just such a detailed localisation for certain neural processes involved in language.

In a fascinating 2015 study published in Brain & Language, Liina Pylkkanen, a professor of linguistics and psychology at NYU and an associate faculty member at NYU Abu Dhabi, led a team that found that, when a person composes basic phrases, a part of the brain called the left anterior temporal lobe (LATL) is activated in similar ways for many different types of phrases.

There are four main lobes in the brain’s cerebral cortex (the part of the brain where higher thought processes happen), of which the temporal lobe is one. As its name indicates, the LATL is the front (anterior) section of the left temporal lobe.

In the study, co-authored by NYU researchers Dr Masha Westerlund (now director of data science at Investopedia in New York), Dr Itamar Kastner (now at Humboldt University in Berlin) and Dr Meera Al Kaabi (now an assistant professor at United Arab Emirates University in Al Ain), the same activation pattern – in terms of location and timing – was seen regardless of whether the person was reading English or Arabic.

That the neural processes involved in reading phrases in Arabic and English are similar might not seem surprising. After all, in both cases people are understanding written words.

It is perhaps harder to predict whether this similarity still holds if, instead of reading, people are producing language, and if that language is signed rather than spoken. Are the processes the brain uses to generate phrases in a sign language the same as those used to produce spoken phrases?

A new paper examining this question has been published in the journal Scientific Reports. Much of the experimental work was carried out at NYU Abu Dhabi by the lead author, Esti Blanco-Elorrieta. The paper is co-authored by Prof Pylkkanen, the senior author, along with Dr Kastner and Professor Karen Emmorey, a sign language specialist at San Diego State University.

The study compared results from 11 deaf sign language users in New York and 11 hearing English speakers living in Abu Dhabi. The English speakers were NYU Abu Dhabi students or faculty members who had recently moved to the Emirates; they were monolingual, speaking little or no Arabic.

As in the Arabic-English study, language processing in the brain was measured with magnetoencephalography (MEG), which detects the magnetic fields associated with neuronal currents. MEG allows detailed measurement of both the timing and location of brain activity. Each participant completed many trials so that a stable pattern of activity could be recorded.

Just as with the Arabic and English speakers, when English speakers were compared with American Sign Language (ASL) users, the researchers found that activity was triggered in the same regions of the brain, and at the same times.

Finding no difference between ASL and English might not appear to be an exciting result, but Prof Pylkkanen said it was important.

“On the one hand, it’s a boring replication, but on the other hand, it’s amazing because we’re seeing similarity in the face of so much difference,” she said.

“We have to keep in mind the groups are different and we ran these studies in separate countries with different MEG machines.”

The result demonstrates a deep-rooted similarity in the brain processes at work when people compose phrases such as “blue cup”, whether in sign language or in spoken language.

The researchers had been uncertain, said Prof Emmorey, whether the difference in output (speaking versus signing) would affect the type of computation taking place in the brain, or the timing at which it happened after a stimulus.

“The hands are much slower, but this had no effect on the timing,” said Prof Emmorey.

“[The overall result is] a very good piece of evidence that we’re dealing with something very fundamental to language.”

Finding the same result for both spoken and sign language simplifies the interpretation of the results. Had differences been found, understanding their causes might not have been easy.

“If we had seen differences, there could have been many, many [reasons for] the differences. If we saw similarities, they are more likely to be caused by similar linguistic processes,” said Prof Pylkkanen.

Last year, The National reported on another study by Prof Pylkkanen and Blanco-Elorrieta in which they found that only artificial or forced language switching engaged the brain regions called the prefrontal cortex (PFC), which is important for inhibition and executive control, and the anterior cingulate cortex (ACC), which is involved in conflict resolution and error monitoring. When bilingual Arabic and English speakers were allowed to switch languages freely, there was no activation of such regions.

Blanco-Elorrieta is now looking again at neural processes associated with bilingualism (including ASL-English bilingualism), which is the main focus of her PhD.

Under Prof Pylkkanen’s supervision, she is trying to understand how concepts are represented in the bilingual brain. For example, is a particular concept in an English speaker’s mind represented in the same way in the brain as it is for an Arabic speaker? Although there are some cases in which the concepts may fully overlap across languages, there are others in which the translation of a word to the other language does not capture exactly the same meaning.

“We’re interested in looking at the underlying representations – how the different mappings between concepts and words co-exist and relate to each other in the bilingual brain,” said Blanco-Elorrieta.