product design | data visualization

Orcasound Analytics

Designed and developed dashboards to evaluate AI versus human performance in detecting orca whale sounds, a project recognized in the top 3% of entries at the 2024 Microsoft Hackathon.
Data Visualization of Seasonal and Hourly Orca Detection Patterns
Timeline
3 months
Live prototype
Team
Scott, Product Owner / Marine Bioacoustician
Dave, Marine Ecologist
Dave, Lead Engineer
Brendan, UX Researcher
Team
Claire, Product Designer, Microsoft
Saugata, Product Designer, Microsoft
Patrick, Machine Learning Engineer, Microsoft

Working with data from the Orcasound hydrophone network in the Puget Sound / Salish Sea, I analyzed and compared reports of orca activity made by an AI audio interpretation model with those made by an online community of citizen scientists. The results led to immediate insights into how to improve both the AI model and the human listening experience.
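To give a sense of the comparison behind the dashboards, here is a minimal sketch of how AI and human detection reports could be aggregated by month and hour of day for a seasonal/hourly view. The file name and the "timestamp" and "source" columns are illustrative assumptions, not the project's actual data schema.

```python
import pandas as pd

# Hypothetical export of detection reports: one row per report,
# with a timestamp and a source of "ai" or "human".
detections = pd.read_csv("detections.csv", parse_dates=["timestamp"])

detections["hour"] = detections["timestamp"].dt.hour
detections["month"] = detections["timestamp"].dt.month

# Count reports per (month, hour) for each reporting source.
counts = (
    detections.groupby(["source", "month", "hour"])
    .size()
    .rename("detections")
    .reset_index()
)

# Pivot so AI and human counts sit side by side, ready for a heatmap
# of seasonal and hourly detection patterns.
comparison = counts.pivot_table(
    index=["month", "hour"], columns="source",
    values="detections", fill_value=0,
)
print(comparison.head())
```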

Empathizing with users

User analytics reports showed that the citizen science community consistently turns out in force for live listening events, even without receiving an alert, using networking platforms like Facebook and WhatsApp to spread word of orca sightings and movements.

Analyzing AI model performance

A careful comparison of the data shows that the AI model, trained to detect orca calls using labeled audio provided by professional scientists, has not performed as well as the citizen scientist listeners. An important reason is the greater context human listeners draw from other data sources, such as sighting networks, rather than audio alone.
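One way to quantify this gap is to score the AI model's detections against human reports within a time tolerance window. The sketch below shows that idea under stated assumptions; the 15-minute window and the sample timestamps are illustrative, not the project's actual evaluation rule.

```python
from datetime import datetime, timedelta

# Assumed tolerance: an AI detection "matches" a human report if the
# two timestamps fall within 15 minutes of each other.
TOLERANCE = timedelta(minutes=15)

def precision_recall(ai_times, human_times, tol=TOLERANCE):
    """Compare two lists of detection datetimes within a tolerance window."""
    matched_ai = sum(
        any(abs(a - h) <= tol for h in human_times) for a in ai_times
    )
    matched_human = sum(
        any(abs(h - a) <= tol for a in ai_times) for h in human_times
    )
    precision = matched_ai / len(ai_times) if ai_times else 0.0
    recall = matched_human / len(human_times) if human_times else 0.0
    return precision, recall

# Illustrative timestamps only.
ai = [datetime(2024, 9, 1, 14, 5), datetime(2024, 9, 1, 18, 40)]
human = [datetime(2024, 9, 1, 14, 10), datetime(2024, 9, 1, 20, 0)]
print(precision_recall(ai, human))  # -> (0.5, 0.5)
```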