Benjamin Fox, Joy Jiang, Sajila Wickramaratne, Patricia Kovatch, Mayte Suarez-Farinas, Neomi A Shah, Ankit Parekh, Girish N Nadkarni, A foundational transformer leveraging full night, multichannel sleep study data accurately classifies sleep stages, Sleep, 2025, zsaf061, https://doi.org/10.1093/sleep/zsaf061
Abstract
To evaluate whether a foundational transformer using 8-hour, multichannel polysomnogram (PSG) data can effectively encode signals and classify sleep stages with state-of-the-art performance.
The Sleep Heart Health Study, Wisconsin Sleep Cohort, and Osteoporotic Fractures in Men (MrOS) Study Visit 1 were used for training, and the Multi-Ethnic Study of Atherosclerosis (MESA), Apnea Positive Pressure Long-term Efficacy Study (APPLES), and MrOS Visit 2 served as independent test sets. We developed PFTSleep, a self-supervised foundational transformer that encodes full night sleep studies comprising brain, movement, cardiac, oxygen, and respiratory channels. These representations were then used to train a separate model to classify sleep stages. We compared our results to existing methods, examined how performance varied with channel input data and training dataset size, and applied an AI explainability tool to analyze the model's decision process.
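The abstract describes a two-stage design: a pretrained foundational transformer produces task-agnostic PSG representations, and a separate classifier head maps those representations to sleep stages. The PyTorch sketch below is illustrative only; the channel count, embedding size, per-epoch feature input, and all class and variable names are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (assumed, not from the paper)
N_CHANNELS = 8     # e.g. EEG, EOG, EMG, ECG, SpO2, respiratory channels
EMBED_DIM = 128    # assumed representation size
N_EPOCHS = 960     # 8 hours of 30-second epochs
N_STAGES = 5       # Wake, N1, N2, N3, REM

class PSGEncoder(nn.Module):
    """Stand-in for the pretrained foundational transformer."""
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(N_CHANNELS, EMBED_DIM)
        layer = nn.TransformerEncoderLayer(d_model=EMBED_DIM, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):                   # x: (batch, N_EPOCHS, N_CHANNELS)
        return self.encoder(self.proj(x))   # (batch, N_EPOCHS, EMBED_DIM)

encoder = PSGEncoder()
encoder.requires_grad_(False)               # freeze the pretrained encoder

classifier = nn.Linear(EMBED_DIM, N_STAGES) # lightweight per-epoch stage classifier

x = torch.randn(2, N_EPOCHS, N_CHANNELS)    # toy batch of epoch-level features
with torch.no_grad():
    reps = encoder(x)                       # task-agnostic representations
logits = classifier(reps)                   # (batch, N_EPOCHS, N_STAGES)
```

In this kind of design, only the classifier head is trained on the labeled staging task, which is consistent with the abstract's note that MESA was added to classifier-head training without retraining the transformer.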
PFTSleep was trained with 13,888 sleep studies and tested on 4,169 independent studies. Cohen's Kappa scores were 0.81 for our held-out set, 0.59 for APPLES, 0.60 for MESA, and 0.75 for MrOS Visit 2. Performance increased to 0.76 on a held-out MESA set when MESA was included in training of the classifier head but not the transformer. Compared to other state-of-the-art AI models, our model achieves high performance across diverse datasets while using only task-agnostic PSG representations from a foundational transformer as input for sleep stage classification.
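Cohen's Kappa quantifies agreement between predicted and expert-scored stages beyond what chance alone would produce. A minimal, purely illustrative computation with scikit-learn (the toy labels below are not study data):

```python
from sklearn.metrics import cohen_kappa_score

# Toy per-epoch stage labels (0=Wake, 1=N1, 2=N2, 3=N3, 4=REM); illustrative only
expert_stages    = [0, 0, 1, 2, 2, 2, 3, 3, 4, 4]
predicted_stages = [0, 1, 1, 2, 2, 2, 3, 2, 4, 4]

kappa = cohen_kappa_score(expert_stages, predicted_stages)
print(f"Cohen's kappa: {kappa:.2f}")
```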
Full night, multichannel PSG representations from a foundational transformer enable accurate sleep stage classification comparable to state-of-the-art AI methods across diverse datasets.