HEAR THEM ROAR
How humans and chickadees understand one another

IS THERE SOMETHING UNIVERSAL about the sounds we make that allows other animals, such as songbirds, to figure out how we're feeling? Sounds like it, according to a new study by UAlberta scientists.

"The idea is that some species (those that are vocal learners) can understand other species' vocalizations," explains PhD student Jenna Congdon ('15 MSc, psychology), who led the study under the supervision of Chris Sturdy (psychology). The study showed that both humans and black-capped chickadees can detect intense emotions, such as fear or excitement, in other species. "For instance, a songbird is able to understand the call of distress of a different type of songbird when they are in the presence of a predator, like an owl or a hawk. Or, for example, if your friend scared you and you screamed. Both of these are high-arousal vocalizations, and being able to understand what that sounds like in a different species can be very useful."

The study also found that black-capped chickadees can identify high arousal in giant pandas, despite the fact that the two species would never encounter one another in the wild.
"WHEN I WAS A CHILD, I remember wondering what would happen if I cut a magnet in half, trying to separate the poles. Of course, you just end up with two smaller magnets, but what we're looking for is a fundamental magnetic charge with only one pole," says James Pinfold (physics). In the same way that an electron is a fundamental electric charge, the magnetic monopole would be the equivalent for magnetism, but it has never been seen. Looking for it requires conditions far beyond the ordinary: in this case, disassembling a $1-million beam pipe that used to be part of the Large Hadron Collider at CERN, the European Organization for Nuclear Research. The collider is the world's largest and most powerful particle accelerator, a 27-kilometre ring of superconducting magnets. The search for the yet-to-be-identified monopole is still ongoing, explains Pinfold. "But we've got our fingers crossed."

SOUND MIND
Detecting depression through voice
AI ALGORITHMS can now more accurately detect depressed mood using the sound of your voice. The research, conducted by PhD student Mashrura Tasnim and her supervisor Eleni Stroulia (computing science), builds on past research that suggests the timbre of our voice contains information about our mood. Using standard benchmark data sets, Tasnim and Stroulia developed a methodology that combines several machine-learning algorithms to recognize depression more accurately using acoustic cues.
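For readers curious what combining several machine-learning algorithms over acoustic cues might look like in practice, here is a minimal sketch in Python. It is not the study's actual pipeline: the MFCC timbre features (via librosa), the three classifiers, and the soft-voting ensemble are all illustrative assumptions standing in for whatever methodology Tasnim and Stroulia used.

# Illustrative sketch only; feature and model choices are assumptions,
# not the published methodology.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

def acoustic_features(path):
    """Summarize one voice recording as a fixed-length feature vector."""
    y, sr = librosa.load(path, sr=16000)
    # MFCCs are a common proxy for vocal timbre.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    # Mean and variance over time yield one vector per recording.
    return np.concatenate([mfcc.mean(axis=1), mfcc.var(axis=1)])

# Combine several learners, in the spirit of merging algorithms
# to classify mood more accurately than any single model.
model = VotingClassifier(
    estimators=[
        ("svm", SVC(probability=True)),
        ("forest", RandomForestClassifier()),
        ("logreg", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",
)

# Usage, given labelled recordings (1 = depressed mood, 0 = not):
#   X = np.stack([acoustic_features(p) for p in paths]); model.fit(X, labels)
#   model.predict([acoustic_features("new_sample.wav")])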
PhD student Mashrura Tasnim and professor Eleni Stroulia (pictured) are authors of a study examining AI's ability to detect depression through voice.
The ultimate goal, Stroulia explains, is to develop meaningful applications from this technology. “A realistic scenario is to have people use an app that will collect voice samples as they speak naturally. The app, running on the user’s phone, will recognize and track indicators of mood, such as depression, over time. Much like you have a step counter on your phone, you could have a depression indicator based on your voice as you use the phone.”