Link Between Music and Language

I’ve written before about Dr. Charles Limb’s research using an fMRI scanner to study the brains of jazz musicians in the act of improvisation. He’s now published some new research that, according to the editor who wrote the headline, supports the language/music link in our brain. He conducted the research using specially designed keyboards containing no metal (so they could be played inside the scanner), scanning the brains of jazz pianists as they played scales and traded fours.

That conversation-like improvisation activated brain areas that normally process the syntax of language, the way that words are put together into phrases and sentences. Even between their turns playing, the brain wasn’t resting. The musicians were processing what they were hearing to come up with new sounds that were a good fit.

At the same time, certain other regions of the brain involved with language — those that process the meaning of words — were tuned down, Limb found.

If I recall correctly, similar research showed that when musicians listen to or perform music, certain regions of the brain, such as the areas that process vision, are less active than normal. The speculation was that this helps the musician focus better on the aural feedback. These results seem similar in that the regions of the brain responsible for processing language become less active.

What confuses me at this point is how this shows a link between music and language, since different regions of the brain are responsible for a spoken conversation as opposed to a musical one. It’s possible that something was left out of the news article, but I also know that editors, not authors, tend to write the titles, and they frequently choose a misleading headline to get readers to click the link. Without going to Limb’s original article, which I’m sure is quite technical and written for neuroscientists, not musicians, it’s hard to say. Either way, it’s another fascinating intersection of music and science.

Lyle Sanford

Dave – not sure this answers your question, but what I think they’re saying is that the syntax-parsing part of the brain is active in processing both musical language and verbal language – but when in musical mode, the semantic-parsing part dims. What I’d love to know is what in music corresponds to the words in spoken language – i.e. what are the bits and pieces that are used to create musical syntax. I think we intuitively know, but spelling it out doesn’t seem so easy.


Thanks, Lyle. That makes sense. I think that perhaps the popular conception that certain brain regions are responsible for things like “language” and “music” is overly simplistic. I recall that many different areas of the brain must interact (including reduced activity in some areas at the same time) in order for certain processing to take place effectively.

It’s interesting research and the possibilities that fMRI and other new technologies have to help us understand the brain are exciting.
