Continuous Telemonitoring of Heart Failure using Personalised Speech Dynamics
1️⃣ One-sentence summary
This study proposes a new method, LIPT, that remotely monitors heart failure by analysing the trajectory of changes in an individual patient's own speech. Its personalised model predicts deterioration more accurately, reaching 99.7% accuracy, and offers an efficient, scalable solution for home-based telemonitoring.
Remote monitoring of heart failure (HF) via speech signals provides a non-invasive and cost-effective solution for long-term patient management. However, substantial inter-individual heterogeneity in vocal characteristics often limits the accuracy of traditional cross-sectional classification models. To address this, we propose a Longitudinal Intra-Patient Tracking (LIPT) scheme designed to capture the trajectory of relative symptomatic changes within individuals. Central to this framework is a Personalised Sequential Encoder (PSE), which transforms longitudinal speech recordings into context-aware latent representations. By incorporating historical data at each timestamp, the PSE facilitates a holistic assessment of the clinical trajectory rather than modelling discrete visits independently. Experimental results from a cohort of 225 patients demonstrate that the LIPT paradigm significantly outperforms the classic cross-sectional approaches, achieving a recognition accuracy of 99.7% for clinical status transitions. The model's high sensitivity was further corroborated by additional follow-up data, confirming its efficacy in predicting HF deterioration and its potential to secure patient safety in remote, home-based settings. Furthermore, this work addresses the gap in existing literature by providing a comprehensive analysis of different speech task designs and acoustic features. Taken together, the superior performance of the LIPT framework and PSE architecture validates their readiness for integration into long-term telemonitoring systems, offering a scalable solution for remote heart failure management.
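The core idea of intra-patient tracking, modelling each visit relative to the patient's own history rather than against a population, can be illustrated with a minimal sketch. The function below is a hypothetical simplification, not the paper's learned Personalised Sequential Encoder: it derives a context-aware representation for each recording from (a) the offset to the patient's baseline visit and (b) the offset to an exponential moving average of past visits. All names and the EMA choice are assumptions for illustration.

```python
import numpy as np

def personalised_trajectory_features(visits, alpha=0.3):
    """Sketch of longitudinal intra-patient tracking.

    visits: (T, D) array, one row of acoustic features per recording,
            ordered in time; row 0 is treated as the personal baseline.
    alpha:  smoothing weight for the running history context.

    Returns a (T, 2*D) array where each row concatenates
    [delta from personal baseline, delta from EMA history context],
    so every timestamp is represented relative to the individual's
    own trajectory instead of by absolute feature values.
    """
    visits = np.asarray(visits, dtype=float)
    baseline = visits[0]          # first (assumed stable) recording
    context = baseline.copy()     # exponential moving average of history
    out = []
    for x in visits:
        out.append(np.concatenate([x - baseline, x - context]))
        context = alpha * x + (1 - alpha) * context  # fold visit into history
    return np.stack(out)
```

A downstream classifier of clinical-status transitions would then consume these relative representations; in the paper this role is played by the learned PSE, which replaces the fixed EMA with trainable sequential modelling.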
Source: arXiv:2602.19674