Tens of thousands of people die every year from cardiac arrest, when the heart suddenly stops beating. People experiencing cardiac arrest will suddenly become unresponsive and either stop breathing or gasp for air, a sign known as agonal breathing. Immediate CPR can double or triple someone’s chance of survival, but that requires someone else to be there.
Researchers at the University of Washington have developed a new tool to monitor people for cardiac arrest while they’re asleep, without touching them. A smart speaker, like a Google Home or Amazon Echo, detects the gasping sound of agonal breathing and calls for help. On average, the proof-of-concept tool, which was developed using real agonal breathing instances captured from 911 calls, detected agonal breathing events 97 per cent of the time from up to 20 feet (6 meters) away. The findings are published June 19 in npj Digital Medicine.
The paper’s co-corresponding author, Shyam Gollakota, said: “A lot of people have smart speakers in their homes. These devices have amazing capabilities that we can take advantage of. We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come and provide CPR. And then if there’s no response, the device can automatically call the emergency services.”
Agonal breathing is present in about 50 per cent of people who experience cardiac arrest, according to 911 call data, and patients who take agonal breaths often have a better chance of surviving than those who do not.
The researchers gathered sounds of agonal breathing from real emergency service calls. Because cardiac arrest patients are often unconscious, bystanders recorded the agonal breathing sounds by putting their phones up to the patient’s mouth so that the dispatcher could determine whether the patient needed immediate CPR. The team collected 162 calls between 2009 and 2017 and extracted 2.5 seconds of audio at the start of each agonal breath to come up with a total of 236 clips. The team captured the recordings on different smart devices – an Amazon Alexa, an iPhone 5s and a Samsung Galaxy S4 – and used various machine learning techniques to boost the dataset to 7,316 positive clips.
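The paper does not spell out the exact augmentation steps here, but the jump from 236 clips to 7,316 positive examples is consistent with generating 31 variants per clip. A minimal sketch of that kind of audio augmentation, assuming simple time shifts and added background noise (the noise level, shift range, and sample rate are illustrative choices, not the authors' settings):

```python
import numpy as np

def augment_clip(clip, rng, noise_db=-20.0, max_shift=0.2, sr=16000):
    """Make one augmented variant of a 2.5-second clip by shifting it
    in time and mixing in low-level white noise.

    Illustrative sketch only: the parameters are assumptions, not the
    pipeline used in the paper."""
    # Random circular time shift of up to max_shift seconds.
    shift = int(rng.integers(-int(max_shift * sr), int(max_shift * sr) + 1))
    shifted = np.roll(clip, shift)
    # Add white noise at a fixed level below the clip's RMS power.
    rms = np.sqrt(np.mean(shifted ** 2)) + 1e-12
    noise = rng.normal(0.0, rms * 10 ** (noise_db / 20.0), size=clip.shape)
    return shifted + noise

def augment_dataset(clips, n_variants, seed=0):
    """Expand each positive clip into n_variants augmented copies."""
    rng = np.random.default_rng(seed)
    return [augment_clip(c, rng) for c in clips for _ in range(n_variants)]

# 236 base clips of 2.5 s at 16 kHz; 31 variants each gives 7,316 clips.
base = [np.sin(np.linspace(0, 100, int(2.5 * 16000))) for _ in range(236)]
augmented = augment_dataset(base, n_variants=31)
print(len(augmented))  # 7316
```

Augmenting in this way exposes a classifier to the kinds of timing and noise variation it would meet in a real bedroom, without needing more recordings of a rare event.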
From these datasets, the team used machine learning to create a tool that could detect agonal breathing 97 per cent of the time when the smart device was placed up to 6 meters away from a speaker generating the sounds.
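The paper's actual model is not described in this article, but the core task is binary audio classification: map each short clip to simple acoustic features and train a classifier to separate agonal breathing from everything else. A toy sketch of that idea, using made-up stand-in audio and a plain logistic regression (the features and data are illustrative assumptions, not the authors' method):

```python
import numpy as np

def features(clip):
    # Crude per-clip features: log-energy and zero-crossing rate,
    # plus a constant bias term.
    energy = np.log(np.mean(clip ** 2) + 1e-12)
    zcr = np.mean(np.abs(np.diff(np.sign(clip)))) / 2.0
    return np.array([energy, zcr, 1.0])

def train_logreg(X, y, lr=0.1, steps=500):
    # Plain gradient-descent logistic regression.
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def predict(w, X):
    return (1.0 / (1.0 + np.exp(-X @ w))) > 0.5

rng = np.random.default_rng(0)
# Toy stand-ins: "positive" clips are loud bursts, negatives quiet hiss.
pos = [rng.normal(0, 1.0, 4000) for _ in range(50)]
neg = [rng.normal(0, 0.1, 4000) for _ in range(50)]
X = np.stack([features(c) for c in pos + neg])
y = np.array([1] * 50 + [0] * 50)
w = train_logreg(X, y)
acc = np.mean(predict(w, X) == y)
print(acc)  # high accuracy on this easily separable toy data
```

A real detector would use far richer features (e.g. spectrograms) and a stronger model, but the train-then-threshold structure is the same.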
Next, the team tested the algorithm to make sure it wouldn’t accidentally classify other types of breathing, such as snoring, as agonal breathing.
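For a device that passively monitors a bedroom all night, the false-positive rate matters as much as the detection rate. A minimal sketch of that check, assuming a hypothetical stand-in detector and synthetic negative clips in place of real snoring recordings:

```python
import numpy as np

def detect(clip, energy_threshold=0.5):
    """Hypothetical stand-in for a trained detector: flags a clip as
    agonal breathing if its RMS energy exceeds a threshold."""
    return np.sqrt(np.mean(clip ** 2)) > energy_threshold

def false_positive_rate(negative_clips, detector):
    """Fraction of known-negative clips the detector wrongly flags."""
    flags = [detector(c) for c in negative_clips]
    return sum(flags) / len(flags)

rng = np.random.default_rng(1)
# Toy negatives standing in for snoring and normal-sleep audio.
negatives = [rng.normal(0, 0.1, 4000) for _ in range(200)]
print(false_positive_rate(negatives, detect))  # 0.0
```

The same loop, run over hours of recorded sleep audio, is how one would estimate how often such a system might wake a household or place an unnecessary emergency call.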
The team envisions the algorithm functioning as an app, or an Alexa skill, that runs passively on a smart speaker or smartphone while people sleep. They plan to commercialise the technology through a company called Sound Life Sciences, Inc.