The Physiological Correlates of Music on the Brain

Updated: Nov 4, 2024


Neural Basis of Tonal Processing in Music

To understand how music affects the brain, researchers have delved into the neural basis of tonal processing. One group conducted a detailed analysis using activation likelihood estimation (ALE) to explore this fascinating area. They performed a voxel-based meta-analysis of 20 published fMRI studies focused on tonal processing in music. Their goal was to identify key brain regions activated during this process.
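In rough outline, ALE works by modeling each study's reported activation coordinates (foci) as three-dimensional Gaussian probability blobs and then combining the per-study maps as a union of probabilities, so that voxels where many studies report nearby foci score highest. The following is a toy sketch of that combination rule, not the published pipeline; the coordinates, FWHM value, and function names are illustrative:

```python
import math

def focus_probability(voxel, focus, fwhm_mm=12.0):
    """Probability that `focus` activates `voxel`, modeled as a 3D Gaussian."""
    # Convert full width at half maximum (mm) to a standard deviation.
    sigma = fwhm_mm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    d2 = sum((v - f) ** 2 for v, f in zip(voxel, focus))
    return math.exp(-d2 / (2.0 * sigma ** 2))

def ale_score(voxel, studies):
    """Union of per-study activation probabilities: 1 - prod(1 - p_i)."""
    survival = 1.0
    for foci in studies:
        # One "modeled activation" value per study: the strongest focus wins.
        p = max((focus_probability(voxel, f) for f in foci), default=0.0)
        survival *= (1.0 - p)
    return 1.0 - survival
```

A voxel near foci reported by several studies would receive a score close to 1, while a voxel far from every reported focus would score near 0; the meta-analysis then asks which high-scoring voxels exceed what chance clustering would produce.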


The researchers hypothesized that significant areas involved in tonal processing would include the posterior and anterior superior temporal gyri, as well as the inferior frontal gyrus (specifically the pars orbitalis), particularly in the right hemisphere. The results confirmed their predictions, revealing active regions in the right frontal lobe, including Brodmann Area 47 and the adjacent insula. Interestingly, the only activation observed in the left hemisphere was in the anterior superior temporal gyrus.


The most notable areas of brain activation across these studies were found at the intersection of the inferior frontal gyrus, anterior insula, and orbitofrontal cortex in Brodmann areas 47 and 13, again emphasizing the crucial role of the right hemisphere in tonal processing. This research enhances our understanding of how our brains interpret and respond to music, highlighting the complex neural pathways involved in experiencing melody and harmony.


Decoding Musical Training: Understanding the Brain's Response to Music


A fascinating way to explore the neural correlates of music is through the study of how the brain processes musical features in those with musical training. A team of researchers from Finland set out to decode the differences between musicians and non-musicians by analyzing brain activity. They began with a hypothesis grounded in previous research, which suggested that significant distinctions exist between these two groups. They anticipated achieving classification accuracies that were higher than chance (2).


The brain regions that best differentiated musicians from non-musicians included the bilateral anterior cingulate and paracingulate gyrus, the opercular part of the inferior frontal gyrus, and the right superior temporal gyrus (2). Notably, the anterior cingulate gyrus displayed stronger BOLD (blood-oxygen-level-dependent) responses in musicians, indicating a greater capacity for music-related working memory. This suggests that musicians have a distinctive ability to concentrate on relevant musical tasks, likely honed through their extensive training.


This study highlights the potential to decode musicianship based on how individual brains respond to music, achieving accuracy levels comparable to those seen in automated clinical diagnoses for neurological and psychological disorders (2).
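The logic of such decoding can be illustrated with a deliberately simple classifier: train on all participants but one, predict the held-out participant's group from their brain-response features, and repeat. Everything below (nearest-centroid classification, leave-one-out cross-validation, and the synthetic features in the test) is an illustrative sketch, not the method or data of the study:

```python
def nearest_centroid_predict(train_X, train_y, x):
    """Assign x to the class whose mean feature vector (centroid) is closest."""
    centroids = {}
    for label in set(train_y):
        rows = [xi for xi, yi in zip(train_X, train_y) if yi == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(x, centroids[label]))

def leave_one_out_accuracy(X, y):
    """Hold out each participant in turn and score predictions made from the rest."""
    hits = 0
    for i in range(len(X)):
        prediction = nearest_centroid_predict(X[:i] + X[i + 1:],
                                              y[:i] + y[i + 1:], X[i])
        hits += prediction == y[i]
    return hits / len(X)
```

With two well-separated groups, this procedure scores far above the 50% chance level, which is the benchmark against which the researchers' decoding accuracy is measured.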



Neural Correlates of Familiarity in Music Listening

In 2018, a team of Canadian scientists embarked on an intriguing exploration of how our brains respond to music we know versus music that is new to us. They conducted a systematic review and a neuroimaging meta-analysis to uncover the neural correlates of familiarity in music listening. Research has shown that both familiarity and repetition can enhance our enjoyment of music, often leading to positive emotions. One key finding was that the left superior temporal gyrus plays a crucial role in distinguishing between familiar and unfamiliar melodies.


The researchers meticulously reviewed existing studies involving various neuroimaging techniques, such as fMRI, EEG, PET, ERP, and MEG, to gather insights into how our brains process music familiarity in healthy individuals. Using activation likelihood estimation (ALE) analysis for fMRI and PET studies, they anticipated discovering that brain regions associated with emotion and reward would be most active when listening to familiar music. This expectation stems from the understanding that familiarity typically correlates with greater enjoyment—at least up to a certain point.


Notably, this study marked the first systematic review and ALE meta-analysis focusing specifically on the neural correlates of familiar and unfamiliar music listening. Previous literature on the “mere exposure” effect has highlighted that prior familiarity with music significantly influences our emotional and pleasurable responses. Interestingly, the researchers expected to see heightened activity in limbic structures when participants listened to familiar tunes. However, their findings revealed a surprising motor pattern of activation instead, challenging some of the existing assumptions about how familiarity impacts our musical experiences.




Neural Differences Between Musicians and Non-musicians

A fascinating study has examined the neural differences between musicians and non-musicians, focusing specifically on how musician children are better at detecting pitch violations in both music and language. Researchers used behavioral assessments and electrophysiological methods to explore this ability. They found notable structural differences in the brains of musicians compared to non-musicians, as revealed by MRI scans. These differences included variations in auditory regions like Heschl’s gyrus and the secondary auditory cortex, as well as changes in motor and visuospatial areas in the brain, including the size of the corpus callosum and planum temporale.


These anatomical distinctions have significant functional implications. For instance, studies using fMRI and magnetoencephalography (MEG) have demonstrated increased activity in Heschl’s gyrus among both professional and amateur musicians when compared to non-musicians. Additionally, musicians exhibited heightened somatosensory and motor activity linked to their musical practice and showed larger bilateral activation in the planum temporale. This research highlights how musical training not only enhances auditory skills but also reshapes the brain’s structure and function.



Neural Correlates of Music-Syntactic Processing

Eminent neuroscientist Stefan Koelsch conducted a fascinating study on the neural correlates of music-syntactic processing, focusing on the crucial roles of Broca’s area and the ventral premotor cortex. This research highlights a specific network in the brain—comprising the inferior frontolateral cortex (IFLC, corresponding to Brodmann Area 44), the ventrolateral premotor cortex (vlPMC), and the anterior superior temporal gyrus (aSTG)—that plays a significant role in processing musical structure. This network is believed to be responsible for calculating the harmonic relationships between chords and preceding sequences, detecting structural irregularities in music, and making quick predictions about upcoming musical events.


In a related study, Koelsch explored the neural mechanisms involved in processing both syntax and semantics in music. Increasing evidence suggests that these elements are fundamental to our musical experience. After a chord is played, initial processing of musical syntax occurs within approximately 150 to 400 milliseconds, while musical semantics is processed from about 300 to 500 milliseconds. The brain areas activated during musical syntax processing include the inferior frontolateral cortex, ventrolateral premotor cortex, and likely the anterior part of the superior temporal gyrus. These regions are crucial for sequencing complex auditory information, identifying structural relationships, and making sequential predictions. Meanwhile, the processing of musical semantics appears to engage the posterior superior temporal regions. Notably, the brain structures involved in understanding musical syntax and semantics closely overlap with those used for language perception, underscoring the deep connections between music and language in the human brain.



Neural Correlates of Music and Language Syntax

A team of European researchers set out to explore how music and language syntax interact in the brain, particularly in Broca’s area, using fMRI technology. Both instrumental music and language share a fundamental characteristic: they are syntactic systems that utilize complex, hierarchical sequences built on implicit structural norms. This organization helps listeners decipher the roles of individual words or musical tones within the context of an unfolding sentence or melody.


In their study, the researchers used fMRI together with an interference paradigm built around sung sentences. They found that the processing demands of musical syntax (such as harmony) and language syntax interact in Broca's area, specifically in the left inferior frontal gyrus. Notably, music and language did not produce separate effects on their own; a distinct language effect emerged only in the complex-harmony condition, suggesting that language processing becomes more pronounced when demands on shared neural resources are heightened.


Unlike previous studies, this research design allowed the scientists to eliminate several potential confounding factors. For instance, they showed that the observed neural interaction was not simply a result of general attention mechanisms, as a psychoacoustic auditory anomaly reacted differently from harmonic manipulations. Additionally, there were no structural errors in either the language or music stimuli, ruling out error processing as an explanation. These findings indicate that music and language, while distinct cognitive domains, may tap into the same high-level syntactic integration resources in Broca’s area.




Studying Neural Correlates of Music: How Pleasant and Unpleasant Pieces Evoke Different Emotions

In their quest to uncover how music affects the brain, researchers recently focused on the neural correlates associated with pleasant and unpleasant musical pieces that evoke a range of emotions. A team of scientists investigated the metabolic and electrical brain patterns triggered by music that elicits both positive and negative feelings. They analyzed the participants’ emotional reactions through various methods, including principal component analysis of validated reports, functional magnetic resonance imaging (fMRI), and electroencephalogram (EEG) coherence (8).
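Principal component analysis, the first step mentioned above, reduces many correlated emotion ratings to a few underlying dimensions by finding the directions of greatest variance. Here is a minimal sketch that extracts the first component via power iteration; the rating values in the test are invented, not the study's data:

```python
import math

def principal_component(data, iterations=200):
    """Return the unit vector along the direction of greatest variance."""
    n, d = len(data), len(data[0])
    # Center each rating dimension on its mean.
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix of the centered ratings.
    cov = [[sum(centered[i][a] * centered[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    # Power iteration: repeated multiplication converges to the top eigenvector.
    v = [1.0] * d
    for _ in range(iterations):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v
```

In a ratings matrix where one dimension (say, pleasantness) dominates the variance, the first component aligns almost entirely with that dimension, which is how a pile of questionnaire items collapses into a single emotional axis.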


In the study, 19 non-musician volunteers listened to ten consecutive 30-second segments of different musical masterpieces, interspersed with random static noise, for a total of 30 minutes of auditory stimulation (8). The results revealed a left cortical network linked to pleasant emotions, involving the left primary auditory area, posterior temporal regions, inferior parietal areas, and prefrontal cortex (8). This suggests that cognitive areas on the left side of the brain may enhance feelings of pleasure, particularly when melodic sequences adhere to expected patterns (8).


Conversely, unpleasant emotions were associated with activation in the right frontopolar and paralimbic regions (8). When the researchers compared all musical excerpts to the static noise, they observed not only bilateral auditory activation but also increased activity in the left temporal lobe, inferior frontal gyrus, and frontopolar area. This indicates that cognitive and language processes are engaged in our general responses to music (8).

Sources

  1. Asano, Rie et al. The Neural Basis of Tonal Processing: An ALE Meta-Analysis. Music and Science; 5: 1-15. 2022.

  2. Saari, Pasi et al. Decoding Musical Training from Dynamic Processing of Musical Features in the Brain. Nature. 2018.

  3. Freitas, Carin et al. Neural Correlates of Familiarity in Music Listening: A Systematic Review and a Neuroimaging Meta-Analysis. Systematic Review. 2018.

  4. Magne, Cyrille et al. Musician Children Detect Pitch Violations in Both Music and Language Better than Non-Musician Children: Behavioral and Electrophysiological Approaches. Journal of Cognitive Neuroscience; 18(2): 199-211. 2006.

  5. Koelsch, Stefan. Significance of Broca's area and Ventral Premotor Cortex for Music-Syntactic Processing. Cortex; 42: 518-520. 2006.

  6. Koelsch, Stefan. Neural substrates of processing syntax and semantics in music. Current Opinion in Neurobiology; 15: 207-212. 2005.

  7. Kunert, Richard et al. Music and Language Syntax in Broca's area: An fMRI Study. PLOS ONE. 2015.

  8. Gutiérrez, Enrique et al. Metabolic and electric brain patterns during pleasant and unpleasant emotions induced by music masterpieces. International Journal of Psychophysiology. 2007.




©2022 by Innexa
