Part of Music Cognition textbook Foundations in Music Psychology: Theory and Research.
Chapter 15: Music and Movement
McGarry, L., Sternin, A., & Grahn, J.
The next time you put on a good song, look around you. If your audience is receptive, you will probably notice some people start to bob their heads or tap their feet, without much prompting. In fact, music and movement are inextricably linked. Even when we are not explicitly moving along, motor areas of our brain are active and muscles are measurably potentiated during music listening. When we create music, we move our bodies to do so, and this movement is heard and felt by the listener. Music-induced movement has been observed across all cultures (Nettl, 2005) and is generally synchronized to the perceived pulse, or “beat,” of the music (Cooper & Meyer, 1960). For an in-depth explanation of beat, see Chapter 2 of this volume.
The body’s physical properties and movements can influence how we perceive music. Studying the body’s influence on music perception is often termed embodied music cognition (Leman & Maes, 2015). Research in embodied cognition is complemented by research in neuroscience that examines the neural mechanisms relating music perception and movement. The current chapter will explore the links between music and movement. We will discuss behavioral and neuroscientific evidence that characterizes how and why we move to music, as well as applied work that exploits this relationship to optimize or rehabilitate movement.
Synchronizing gait to music-based auditory cues (rhythmic auditory stimulation) is a strategy used to manage gait impairments in a variety of neurological conditions, including Parkinson’s disease. However, knowledge of how to individually optimize music-based cues is limited. The purpose of this study was to investigate how instructions to synchronize with auditory cues influence gait outcomes among healthy young adults with either good or poor beat perception ability. Sixty-five healthy adults walked to metronome and musical stimuli with high and low levels of perceived groove (how strongly the music induces a desire to move) and familiarity, at a tempo equivalent to their self-selected walking pace. Participants were randomized to one of two instruction conditions: (i) synchronized: match footsteps with the beat, or (ii) free-walking: walk comfortably. Participants were classified as good or poor beat perceivers using the Beat Alignment Test. Poor beat perceivers showed better balance-related parameters (stride width and double-limb support time) when they were not instructed to synchronize their gait with the cues than when synchronization was required. Good beat perceivers, in contrast, performed better when instructed to synchronize their gait than when no synchronization was required. Changes in stride length and velocity were influenced by musical properties, in particular perceived groove (greater stride length and velocity with high- versus low-groove cues), and in some cases these effects interacted with beat perception ability. The results indicate that beat perception ability and instructions to synchronize indeed influence spatiotemporal gait parameters when walking to music- and metronome-based rhythmic auditory stimuli. Importantly, these results suggest that both low-groove cues and instructing poor beat perceivers to synchronize may interfere with walking performance, thus potentially impacting both empirical and clinical outcomes.
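The “synchronized” condition above implies a way to score how well footfalls align with the cue stream. As a purely illustrative sketch (the function name and details are our own, not the study’s analysis pipeline), step-to-beat asynchronies and their circular consistency can be computed like this:

```python
import numpy as np

def step_beat_synchrony(step_times, beat_times):
    """Mean step-to-beat asynchrony (seconds) and circular consistency R (0-1)."""
    beat_times = np.asarray(beat_times)
    beat_period = np.median(np.diff(beat_times))
    # asynchrony of each footfall relative to its nearest cue
    asyncs = np.array([s - beat_times[np.argmin(np.abs(beat_times - s))]
                       for s in step_times])
    # express asynchronies as phase angles on the beat cycle
    phases = 2 * np.pi * asyncs / beat_period
    R = np.abs(np.mean(np.exp(1j * phases)))  # 1 = perfectly consistent timing
    return asyncs.mean(), R

# Demo: cues every 0.5 s; footfalls consistently 20 ms ahead of the beat
beats = np.arange(0.0, 10.0, 0.5)
steps = beats - 0.02
mean_async, consistency = step_beat_synchrony(steps, beats)
print(mean_async, consistency)  # ≈ -0.02, ≈ 1.0
```

A consistently negative mean asynchrony (anticipating the beat) with high R is the typical signature of successful sensorimotor synchronization; poor beat perceivers would be expected to show lower R values.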
Background: Humans spontaneously mimic the facial expressions of others, facilitating social interaction. This mimicking behavior may be impaired in individuals with Parkinson’s disease, for whom the loss of facial movements is a clinical feature.
Objective: To assess the presence of facial mimicry in patients with Parkinson’s disease.
Method: Twenty-seven non-depressed patients with idiopathic Parkinson’s disease and 28 age-matched controls had facial muscle activity recorded with electromyography while they observed presentations of calm, happy, sad, angry, and fearful expressions.
Results: Patients exhibited reduced amplitude and delayed onset in the zygomaticus major muscle region (smiling response) following happy presentations (patients M = 0.02, 95% confidence interval [CI] −0.15 to 0.18, controls M = 0.26, CI 0.14 to 0.37, ANOVA, effect size [ES] = 0.18, p < 0.001). Although patients exhibited activation of the corrugator supercilii and medial frontalis (frowning response) following sad and fearful presentations, the frontalis response to sad presentations was attenuated relative to controls (patients M = 0.05, CI −0.08 to 0.18, controls M = 0.21, CI 0.09 to 0.34, ANOVA, ES = 0.07, p = 0.017). The amplitude of patients’ zygomaticus activity in response to positive emotions was negatively correlated with response times for ratings of emotional identification, suggesting a motor-behavioral link (r = −0.45, p = 0.02, two-tailed).
Conclusions: Patients showed decreased mimicry overall, mimicking other people’s frowns to some extent, but presenting with profoundly weakened and delayed smiles. These findings open a new avenue of inquiry into the “masked face” syndrome of PD.
When we see or hear another person execute an action, we tend to automatically simulate that action. Evidence for this has been found at the neural level, specifically in parietal and premotor brain regions referred to collectively as the mirror neuron system (MNS), and at the behavioural level, through an observer’s tendency to mimic observed movements. This simulation process may play a key role in emotional understanding. It is currently unclear to what extent the MNS is driven by bottom-up, automatic recruitment of movement simulation versus top-down (task-driven) mechanisms. The present dissertation examines the role of the MNS in the bottom-up and top-down processing of action in the auditory and visual modalities, in response to emotional and neutral human movements, including musical performance. Study 1 (published in Experimental Brain Research; McGarry, Russo, Schalles & Pineda, 2012) used EEG to demonstrate that the MNS is affected by bottom-up manipulations of modality, showing that the MNS is activated to a greater extent by multimodal than by unimodal sensory input. Study 2 (published in Cognitive, Affective, and Behavioral Neuroscience; McGarry, Pineda & Russo, 2015) employed an EEG paradigm utilizing a top-down emotion judgment manipulation of musical pitch intervals that varied in emotionality and pitch distance between notes. It was found that the left superior temporal gyrus (STG), part of the extended MNS, is affected by top-down manipulations of emotionality (judgment type: emotion or pitch distance), but no areas in the classical MNS met the statistical threshold for top-down effects. Study 3 employed an fMRI paradigm combining bottom-up and top-down manipulations. It was found that the classical MNS was strongly affected by bottom-up differences in emotionality and modality, and minimally affected by the top-down manipulation.
Together, the three studies presented in this dissertation support the premise that the classical mirror neuron system is primarily automatic. More research is needed to determine whether top-down manipulations can uniquely engage the MNS.
In the present study, we examined the involvement of the extended mirror neuron system (MNS)—specifically, areas that have a strong functional connection to the core system itself—during emotional and nonemotional judgments about human song. We presented participants with audiovisual recordings of sung melodic intervals (two-tone sequences) and manipulated emotion and pitch judgments while keeping the stimuli identical. Mu event-related desynchronization (ERD) was measured as an index of MNS activity, and a source localization procedure was performed on the data to isolate the brain sources contributing to this ERD. We found that emotional judgments of human song led to greater amounts of ERD than did pitch distance judgments (nonemotional), as well as control judgments related to the singer’s hair, or pitch distance judgments about a synthetic tone sequence. Our findings support and expand recent research suggesting that the extended MNS is involved to a greater extent during emotional than during nonemotional perception of human action.
Previous studies demonstrate that perception of action presented audio-visually facilitates greater mirror neuron system (MNS) activity in humans (Kaplan and Iacoboni in Cogn Process 8(2):103–113, 2007) and non-human primates (Keysers et al. in Exp Brain Res 153(4):628–636, 2003) than perception of action presented unimodally. In the current study, we examined whether audio-visual facilitation of the MNS can be indexed using electroencephalography (EEG) measurement of the mu rhythm. The mu rhythm is an EEG oscillation with peaks at 10 and 20 Hz that is suppressed during the execution and perception of action and is speculated to reflect activity in the premotor and inferior parietal cortices as a result of MNS activation (Pineda in Behav Brain Funct 4(1):47, 2008). Participants observed experimental stimuli unimodally (visual-alone or audio-alone) or bimodally during randomized presentations of two hands ripping a sheet of paper, and a control video depicting a box moving up and down. Audio-visual perception of action stimuli led to greater event-related desynchrony (ERD) of the 8–13 Hz mu rhythm compared to unimodal perception of the same stimuli over the C3 electrode, as well as in a left central cluster when data were examined in source space. These results are consistent with Kaplan and Iacoboni’s (Cogn Process 8(2):103–113, 2007) findings that indicate audio-visual facilitation of the MNS; our left central cluster was localized approximately 13.89 mm away from the ventral premotor cluster identified in their fMRI study, suggesting that these clusters originate from similar sources. Consistency of results in electrode space and component space support the use of ICA as a valid source localization tool.
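Mu ERD, as described in these abstracts, is conventionally quantified as the percentage change in 8–13 Hz band power during action observation relative to a baseline period, with negative values indicating desynchronization (suppression). The following is a minimal illustrative sketch, not the studies’ actual pipeline; the function names and the simulated signals are our own:

```python
import numpy as np

def band_power(signal, fs, fmin, fmax):
    """Mean power in the [fmin, fmax] Hz band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()

def mu_erd_percent(baseline, event, fs, fmin=8.0, fmax=13.0):
    """ERD% = (event - baseline) / baseline * 100; negative = desynchronization."""
    p_base = band_power(baseline, fs, fmin, fmax)
    p_event = band_power(event, fs, fmin, fmax)
    return 100.0 * (p_event - p_base) / p_base

# Demo: a 10 Hz "mu" oscillation whose amplitude halves during action observation
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
event = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
erd = mu_erd_percent(baseline, event, fs)
print(erd)  # strongly negative, reflecting mu suppression
```

Halving the oscillation’s amplitude quarters its power, so the ERD here lands near −75%; in real EEG work the same logic is applied per trial and per electrode (e.g., C3) before averaging.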
Memory for emotional events is usually very good even when tested shortly after study, before it is altered by the influence of emotional arousal on consolidation. Immediate emotion-enhanced memory (EEM) may stem from the influence of emotion on cognitive processes at encoding and retrieval. Our goal was to test which cognitive factors are necessary and sufficient to account for EEM, with a specific focus on clarifying the contribution of attention to this effect. In two experiments, participants encoded negative-arousing and neutral pictures. In Experiment 1, under divided-attention conditions, negative pictures were better attended and recalled even when they were matched with neutral pictures on semantic relatedness and distinctiveness, and attention at encoding predicted subsequent emotion-enhanced memory. The memory advantage for emotional stimuli was abolished only when attention to emotional and neutral stimuli was also matched, under full attention in Experiment 1 and under divided attention in Experiment 2. Emotional memory enhancement was larger in Experiment 1, when the control of organization and distinctiveness was relaxed. These findings suggest that attention, organization, and distinctiveness provide a necessary and sufficient account for immediate emotion-enhanced free recall.
Mirroring, an exercise practiced in Dance/Movement Therapy (DMT), is considered by practitioners and patients to enhance emotional understanding and empathy for others. Mirroring involves imitation by the therapist of movements, emotions, or intentions implied by a client’s movement, and is commonly practiced in order to enhance the therapist’s empathy for the client. Despite enthusiastic claims for its effectiveness, a clear theoretical framework that would explain the effects of mirroring on empathy has not yet been presented, and empirical research on the topic is generally lacking. In this review, we propose that mirroring in DMT enhances understanding of others’ emotional intentions through enhanced use of mirror neuron circuitry. Research on the mirror neuron system (MNS) suggests that the brain areas involved in the perception and production of movement overlap, and that these brain areas are also involved in the understanding of movement intention (Rizzolatti & Craighero, 2004). One important route to emotion recognition involves a neural simulation of another person’s emotional actions in order to infer the intentions behind those actions and empathize with them. Future research is proposed to systematically explore the effectiveness of mirroring in dance therapy, the neural mechanisms behind it, and its applicability to patient populations who have problems with empathy.
Emotional events are more organized and distinctive than neutral events. We asked whether organization and distinctiveness can account for emotionally enhanced memory. To examine organization, we compared memory for arousing, negatively valenced pictures and inter-related neutral pictures. To examine distinctiveness, we manipulated list composition, comparing mixed lists, which contained emotional and neutral items, to pure lists, which contained only items of a single type and removed the relative-distinctiveness advantage of emotional items. We show that emotional memory is enhanced in immediate memory tests as long as either organization or distinctiveness is allowed to play a role. When these effects are removed, in the comparison of emotional and related neutral items in pure lists, the emotional memory advantage is eliminated. Examining the contribution of mediating cognitive factors at a behavioral and neural level is crucial if we are to understand how emotion influences memory.