Yesterday, I was thrilled to be awarded the UC Davis Award for Excellence in Postdoctoral Research!
2021 Picnic Day Booth: Speech Science
Come learn about an interdisciplinary research project exploring how adults and kids talk to Amazon’s Alexa, compared to how they talk to a human. You’ll see an example of the experiment, meet the team, and get a behind-the-scenes look at the research process!
Interested in participating? http://phonlab.ucdavis.edu/participate
CBS-13 Interview & Press Release
Today, UC Davis published a press release and we did an interview with CBS-13 Sacramento on our project, Intelligibility of face-masked speech depends on speaking style: Comparing casual, smiled, and clear speech.
My co-authors, Anne Pycha (University of Wisconsin-Milwaukee) and Georgia Zellou (UC Davis), and I had a blast working together on a new project: how wearing a fabric face mask (as is common these days) affects speech intelligibility.
[Take-away: masks don’t simply reduce intelligibility! The speaker’s style plays an important role.]
Click here to read the paper in Cognition.

Two posters at LSA 2021
Come see us present two of our projects tomorrow, Saturday, January 9th, at the Linguistic Society of America (LSA) 2021 Annual Meeting!
Prosodic focus in human- versus voice-AI-directed speech (11:00am-12:30pm)
Eleonora Beier, Michelle Cohn, Fernanda Ferreira, & Georgia Zellou
In this study, we test whether speakers differ in how they prosodically mark focus in speech directed toward an adult human versus a voice-activated artificially intelligent (voice-AI) system (here, Amazon’s Alexa). Overall, we found that speakers prosodically mark focus similarly for both types of interlocutors; this suggests that speakers may view voice-AI (e.g., Alexa) as a rational listener who will benefit from prosodic focus marking. At the same time, there were several targeted differences by focus type, which suggests that speakers can adjust their use of prosodic focus marking based on the perceived properties of the listener.
- Selected as a newsworthy project and included in the LSA Press Release
The Interaction between Phonological & Semantic Usage Factors in Dialect Intelligibility in Noise (2:00pm-3:30pm)
Riley Stray, Michelle Cohn, & Georgia Zellou
This study examines how the “American” or “British” meaning of a word (e.g., “chips”), spoken in a British (GB) or American (US) accent, can affect speech-in-noise intelligibility. Overall, we found that the British speaker was more intelligible when producing British sentences, but also that intelligibility decreased as sentences became more stereotypically British. Results suggest that both phonological and semantic properties of phrases affect the intelligibility of words across dialects, and that a particular semantic usage in a less familiar dialect can decrease intelligibility as sentences become less predictable.
New paper in Cognition!
Our paper, Intelligibility of face-masked speech depends on speaking style: Comparing casual, smiled, and clear speech, was accepted to Cognition today!
My co-authors, Anne Pycha (University of Wisconsin-Milwaukee) and Georgia Zellou (UC Davis), and I had a blast working together on a new project: how wearing a fabric face mask (as is common these days) affects speech intelligibility.
[Take-away: masks don’t simply reduce intelligibility! The speaker’s style plays an important role.]
New Frontiers paper on voice-AI!
We’re thrilled that our paper, Age- and gender-related differences in speech alignment toward humans and voice-AI, was accepted at Frontiers in Communication: Language Sciences today!
https://www.frontiersin.org/articles/10.3389/fcomm.2020.600361/abstract
Amazon Research Grant Awarded!
I’m thrilled that our project, “Speech entrainment during socialbot conversations”, has been funded with an Amazon Research Grant ($46,485). PI: Georgia Zellou; co-PI: Michelle Cohn.
LSA 2021
Two upcoming presentations at the 2021 Linguistic Society of America (LSA) Annual Meeting:
- Eleonora Beier, Michelle Cohn, Fernanda Ferreira, & Georgia Zellou: Prosodic focus in human- versus voice-AI-directed speech (Saturday, January 9, 2021 – 11:00am to 12:30pm)
- Riley Stray, Michelle Cohn, & Georgia Zellou: The Interaction between Phonological and Semantic Usage Factors in Dialect Intelligibility in Noise (Saturday, January 9, 2021 – 2:00pm to 3:30pm)
UCD HCI Research Group Website
Check out the new UC Davis Human-Computer Interaction (HCI) Research Group website (now hosted on UC Davis SiteFarm).
https://hciresearchgroup.sf.ucdavis.edu/
New UCD HCI Research Group
In Fall 2020, I launched the UC Davis HCI Research Group: a collective of professors, postdocs, graduate students, and undergraduate students across campus investigating the dynamics of human-computer interaction.
We have a quarterly talk series (on Zoom), beginning in Fall Quarter 2020.

To join the mailing list and receive updates and Zoom links, please email Michelle Cohn (mdcohn@ucdavis.edu).
