Abstract

Insufficient sleep quality is directly linked to various diseases, making reliable sleep monitoring crucial for prevention, diagnosis, and treatment. As sleep laboratories are cost- and resource-prohibitive, wearable sensors offer a promising alternative for long-term, unobtrusive sleep monitoring at home. Current unobtrusive sleep detection systems are mostly based on actigraphy (ACT), which tends to overestimate sleep due to the lack of movement during short periods of wakefulness. Previous research established sleep stage classification by combining ACT with cardiac information but has not investigated the incorporation of respiration in large-scale studies. For that reason, this work aims to systematically compare ACT-based sleep stage classification with multimodal approaches combining ACT, heart rate variability (HRV), and respiration rate variability (RRV) using state-of-the-art machine learning and deep learning algorithms. The evaluation is performed on a publicly available sleep dataset comprising more than 1,000 recordings. Respiratory information is introduced through ECG-derived respiration (EDR) features, which are evaluated against traditional respiration belt data. Results show that including RRV features improves the Matthews Correlation Coefficient (MCC), with long short-term memory (LSTM) networks performing best. For sleep staging based on AASM standards, the LSTM achieved a median MCC of 0.51 (0.16 IQR). Respiratory information enhanced classification performance, particularly in detecting Wake and rapid eye movement (REM) sleep epochs. Our findings underscore the potential of including respiratory information in sleep analysis to improve sleep detection algorithms and, thus, help transfer sleep laboratory capabilities into a home monitoring environment. The code used in this work can be found online at https://github.com/mad-lab-fau/sleep_analysis.
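As a minimal illustration of the evaluation metric named above, the sketch below computes the MCC for the binary sleep/wake case from raw epoch labels (the multiclass sleep-staging case uses the generalized form, e.g. `sklearn.metrics.matthews_corrcoef`). The function name and the toy label sequences are illustrative, not taken from the authors' repository.

```python
import math

def mcc(y_true, y_pred):
    """Matthews Correlation Coefficient for binary epoch labels
    (here 0 = wake, 1 = sleep); returns 0.0 on a degenerate confusion matrix."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Toy example: 8 scored epochs
y_true = [1, 1, 0, 1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 1]
print(round(mcc(y_true, y_pred), 3))  # → 0.467
```

Unlike accuracy, the MCC stays near zero for a classifier that always predicts "sleep", which is why it is a common choice for class-imbalanced sleep/wake data.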

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.
