Meta-analysis of language & music auditory processing: To test whether trained musicians (relative to nonmusicians) show different patterns of cerebral lateralization for speech and music perception, I conducted an Activation Likelihood Estimation (ALE; Turkeltaub et al., 2002) analysis. This method uses coordinates reported for particular contrasts (e.g., speech sounds vs. noise, sinusoidal tones, etc.) in published fMRI/PET studies to identify shared areas and networks of activation across experiments. While additional research is necessary, these preliminary findings suggest that musical training may drive a more bilateral pattern of activation for both speech and music perception, whereas nonmusicians show the more canonical left lateralization for language and right lateralization for music. I presented this research at the Society for the Neurobiology of Language Conference (October 2015, Chicago) and at the UC Davis Symposium on Language Research (April 2015, Davis). This project also won the UC Davis Department of Linguistics Lapointe Award in 2015.
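The core of an ALE analysis can be illustrated with a minimal sketch: each reported peak coordinate (focus) is blurred with a 3D Gaussian to produce a "modeled activation" (MA) map per experiment, and the maps are combined as a probabilistic union. This is a toy illustration only, not the actual analysis pipeline used in the project; the grid size, smoothing width, and foci below are invented for demonstration, and a real analysis operates in standardized MNI/Talairach space with empirically derived uncertainty kernels.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

GRID = (20, 20, 20)   # toy voxel grid (illustrative; real analyses use standard brain space)
SIGMA = 2.0           # Gaussian SD in voxels, a stand-in for spatial uncertainty

def ma_map(foci):
    """Modeled-activation map for one experiment's peak coordinates."""
    vol = np.zeros(GRID)
    for x, y, z in foci:
        vol[x, y, z] = 1.0          # mark each reported focus
    vol = gaussian_filter(vol, sigma=SIGMA)  # blur to model spatial uncertainty
    return vol / vol.max()          # rescale so peaks approach probability 1

def ale(experiments):
    """Combine MA maps with a probabilistic union: ALE = 1 - prod(1 - MA_i)."""
    ale_map = np.zeros(GRID)
    for foci in experiments:
        ma = ma_map(foci)
        ale_map = 1.0 - (1.0 - ale_map) * (1.0 - ma)
    return ale_map

# Three hypothetical experiments reporting nearby foci (converging evidence)
experiments = [[(5, 5, 5), (10, 10, 10)], [(6, 5, 5)], [(5, 6, 5)]]
result = ale(experiments)
print(result.max())  # peak likelihood where the experiments converge
```

Voxels where many experiments report nearby foci accumulate high ALE values; significance in a real analysis is then assessed against a null distribution of randomly placed foci.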
Neural language processing in musicians vs. nonmusicians: An investigation of the ‘visual word form area’ (with David Corina & Laurel Lawyer): In 2014 we developed a pilot study examining the effects of language and musical experience on visual processing pathways in the putative ‘visual word form area’ (VWFA), a region in the left fusiform gyrus that preferentially responds to visually presented language. Our preliminary results suggest that extensive musical training may affect patterns of lateralization for word reading. I presented this work at the Society for the Neurobiology of Language Conference (August 2014, Amsterdam), the UC Davis Symposium on Language Research (April 2014, Davis), and the Interdisciplinary Graduate and Professional Symposium (April 2014, Davis), where it won the Chancellor’s Grand Prize for Best Oral Presentation ($5,000) and the Dean’s Prize for Best Oral Presentation in Social Sciences ($1,000).
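Hemispheric asymmetry of the kind described above is often summarized with a laterality index, LI = (L - R) / (L + R), computed over some activation measure (e.g., suprathreshold voxel counts or summed beta weights) in homologous left- and right-hemisphere regions. The sketch below shows this common convention; it is an illustration of the general measure, not necessarily the specific metric used in this study.

```python
def laterality_index(left, right):
    """Laterality index from left- and right-hemisphere activation measures.

    Ranges from +1 (fully left-lateralized) through 0 (bilateral)
    to -1 (fully right-lateralized). Inputs might be suprathreshold
    voxel counts or summed activation values (illustrative choice).
    """
    if left + right == 0:
        raise ValueError("no activation in either hemisphere")
    return (left - right) / (left + right)

print(laterality_index(120, 40))  # 0.5  -> left-lateralized
print(laterality_index(50, 50))   # 0.0  -> bilateral
```

By this convention, a shift toward more bilateral activation in musicians would appear as LI values closer to zero than those of nonmusicians.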
Distinguishing gesture processing from sign language processing: The contributions of the superior temporal lobe (with David Corina, Laurel Lawyer, and Shane Blau): An fMRI investigation of linguistically relevant signs in American Sign Language (ASL) compared with non-linguistic self-grooming gestures. Our analysis focused on the posterior superior temporal gyrus (pSTG), as prior evidence suggests that this area is particularly sensitive to linguistic auditory input. Our question was whether linguistically relevant input in the visual domain, as in the case of ASL, would also activate this region. We presented this work at the International Society for Gesture Studies Conference (July 2014, San Diego) and the Cognitive Neuroscience Society Annual Meeting (April 2014, Boston).