How our brains produce language and thought, according to neuroscientists

For thousands of years philosophers have argued about the purpose of language. Plato believed it was necessary for thought. Thought “is the silent inward conversation of the soul with itself,” he wrote.

Many modern scholars have held similar views. Beginning in the 1960s, Noam Chomsky, a linguist at MIT, argued that we use language for reasoning and other forms of thought. “If there is a serious deficit in language, there is a serious deficit in thought,” he wrote.

As a university student, Evelina Fedorenko took a class taught by Dr. Chomsky and heard him describe his theory. “I really liked the idea,” she recalled. But she was puzzled by the lack of evidence. “A lot of the things he was saying were just stated as if they were facts — the truth,” she said.

Dr. Fedorenko became a cognitive neuroscientist at MIT, using brain scans to study how the brain produces language. And after 15 years, her research led her to a surprising conclusion: We don’t need language to think.

“When you start evaluating it, you just don’t find support for this role of language in thought,” she said.

When Dr. Fedorenko began this work in 2009, studies had found that the same brain regions needed for language were also active when people reasoned or performed arithmetic.

But Dr. Fedorenko and other researchers discovered that this overlap was a mirage. Part of the problem with the early results was that the scanners were quite crude. The researchers got the most out of their fuzzy scans by combining the results from all their volunteers to create an overall average of brain activity.

In her own research, Dr. Fedorenko used more powerful scanners and performed more tests on each volunteer. These steps allowed her and her colleagues to collect enough data from each person to build a fine-grained picture of an individual’s brain.

The researchers then conducted studies to determine the brain circuits involved in language tasks such as retrieving words from memory and following grammatical rules. In a typical experiment, volunteers read gibberish followed by real sentences. The researchers discovered certain areas of the brain that only became active when the volunteers processed real language.

Each volunteer had a language network, a constellation of regions that are activated during language tasks. “It’s very stable,” said Dr. Fedorenko. “If I scan you today and again 10 or 15 years later, it will be in the same place.”

The researchers then scanned the same people as they performed different kinds of thinking, such as solving a puzzle. “Other areas in the brain are working really hard when you’re doing all these forms of thinking,” she said. But the language network stayed quiet. “It was clear that none of these things seemed to engage language circuits,” she said.

In an article published Wednesday in Nature, Dr. Fedorenko and her colleagues argued that studies of people with brain injuries point to the same conclusion.

Strokes and other forms of brain damage can wipe out the language network, leaving people with difficulty processing words and grammar, a condition known as aphasia. But researchers have found that people with aphasia can still do algebra and play chess. In experiments, they can look at two numbers, say 123 and 321, and recognize that, by the same pattern, 456 should be followed by 654.
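The pattern in that task is simple digit reversal. For concreteness, here is a minimal sketch in Python; the function name and test values are illustrative, not taken from the study itself.

```python
def reverse_digits(n: int) -> int:
    """Return the number formed by reading n's digits right to left."""
    return int(str(n)[::-1])

# The example pair above: 123 maps to 321 under reversal,
# so applying the same rule to 456 gives 654.
assert reverse_digits(123) == 321
assert reverse_digits(456) == 654
```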

If language is not necessary for thought, then what is language for? Communication, say Dr. Fedorenko and her colleagues. Dr. Chomsky and other researchers rejected this idea, pointing to the ambiguity of words and the difficulty of expressing our intuitions out loud. “The system is not well designed in many functional ways,” Dr. Chomsky said.

However, large studies suggest that languages have been optimized to convey information clearly and efficiently.

In one study, researchers found that frequently used words are shorter, making language learning easier and the flow of information faster. In another study, researchers who examined 37 languages found that grammar rules put words close together so that their combined meaning is easier to understand.

Kyle Mahowald, a linguist at the University of Texas at Austin who was not involved in the new work, said the separation of thought and language could help explain why AI systems like ChatGPT are so good at some tasks and so bad at others.

Computer scientists train these programs on huge amounts of text, and the programs discover rules about how words are connected. Dr. Mahowald suspects that these programs are beginning to mimic the language network in the human brain, but that they can’t reason.

“It is possible to have a very fluent grammatical text that may or may not have a coherent underlying idea,” said Dr. Mahowald.

Dr. Fedorenko noted that many people intuitively believe that language is necessary for thinking because they have an inner voice that narrates their every thought. But not everyone has this ongoing monologue. And few studies have investigated this phenomenon.

“I don’t have a model like that yet,” she said. “I haven’t even done what I would need to do to speculate in this way.”
