I’m thrilled that our grant, with co-PIs Dr. Alyssa Weakley (UC Davis Neurology) and Dr. Chen-Nee Chuah (UC Davis Computer Science), was one of six selected for funding.
Interpretable Acoustic- and LLM-based Cognitive Impairment Detection in Older Adults.
I’m thrilled that my project, with Georgia Zellou (Professor, UC Davis Linguistics), was one of four grants selected for a Translated Imminent Grant!
Voices We Trust: How L1 and L2 English Speakers Judge LLM Translation Accuracy
PI: Cohn, Assistant Project Scientist, UC Davis Phonetics Lab, Dept. of Linguistics
Co-PI: Zellou, Professor, UC Davis Phonetics Lab, Dept. of Linguistics
I’m thrilled to start as a Specialist with the Vonk Lab at the UCSF Fein Memory and Aging Center. I will be working on a project investigating automated speech analyses to probe cognitive impairment.
I’m thrilled to have received a Googler-Initiated Research Gift ($50,000) for my work on anthropomorphism and trust in large language models (LLMs). The next stages of the project will investigate cross-linguistic and cross-cultural factors in these interactions.
I’m thrilled that two of my undergraduate research mentees, Prati Mehta and Ben Getz (whom I’m also co-advising for his Senior Honors Thesis), presented posters at the UC Davis Undergraduate Research Conference on Friday, April 25th!
Prati Mehta: “Do Wav2vec Embeddings Track Phonetic Differences in Duration?”
Ben Getz: “Silent Center Syllables: A Study in Coarticulation Across Speech Style and Word Type”
Ages 7–12, Lottery Tours, 10:00am, 11:00am, 12:00pm, 1:00pm; capacity 5 per tour.
Participate in a real speech science experiment! Have you ever wondered how you’re able to understand speech? Or how your mouth and tongue coordinate to produce it? The appointment is for 45 minutes; the experiment itself takes about 5 minutes. Afterward, you’ll see a short presentation on our research, and kids (and adults!) will have time to ask questions and get a tour of the lab. Participation in the experiment is voluntary; the study has been approved by the UC Davis Institutional Review Board (IRB) ethics committee. For more information about consent, go to: https://phonlab.ucdavis.edu/child-consent-participate-experiment-volunteer. Sign up at https://hr.ucdavis.edu/departments/worklife-wellness/events/tocs.
We had a wonderful time hosting our Speech Science booth at UC Davis Picnic Day! Adults and kids could learn about spectrograms and try a “bot or not” experiment.
I was thrilled to be invited as this year’s Yvonne Becker Colloquium Speaker in the Department of Linguistics at Simon Fraser University (May 30, 12:30pm – 2:00pm).
Title: Impact of AI on human language
Abstract: Millions of people now talk to voice-activated artificially intelligent (voice-AI) systems (e.g., Siri, Alexa, Google Assistant, ChatGPT) to complete daily tasks. My research program tests how people (1) talk to, (2) perceive, and (3) learn language from voice-AI. At its core, I ask: is communication with voice-AI similar to, or distinct from, communication with another human? I design experiments to probe behavior, combining methods from psycholinguistics, human-computer interaction, and phonetics. Thus far, I have found that while people produce a distinct technology-directed register, they also attribute human social qualities to the systems (e.g., gender, emotion) and learn speech patterns from them. I discuss these findings in terms of their implications for linguistic diversity and language change.