Asynchronous Federated Unlearning with Invariance Calibration for Medical Imaging
1️⃣ One-sentence summary
This paper proposes a new framework, AFU-IC, that lets clients in federated learning remove their data contributions asynchronously, without interrupting global training, and uses a server-side invariance calibration mechanism to prevent the model from relearning the erased data. This satisfies the "right to be forgotten" efficiently and compliantly, and substantially reduces waiting latency in medical imaging tasks.
Federated Unlearning (FU) is an emerging paradigm in Federated Learning (FL) that enables participating clients to fully remove their contributions from a trained global model, driven by data protection regulations that mandate the right to be forgotten. However, existing FU methods mostly rely on synchronous coordination: the entire federation must halt and wait for stragglers to complete erasure, creating significant delays under device heterogeneity. Moreover, in these methods the influence of erased data is often only temporarily suppressed and resurfaces during subsequent training, rather than being genuinely removed. To overcome these limitations, this paper proposes Asynchronous Federated Unlearning with Invariance Calibration (AFU-IC), a novel framework for medical imaging that decouples the erasure process from the global training workflow, allowing the target client to perform unlearning asynchronously without interrupting global training. Meanwhile, a server-side invariance calibration mechanism prevents the model from relearning the erased data. Extensive experiments on three medical benchmarks demonstrate that AFU-IC achieves unlearning efficacy and model fidelity comparable to gold-standard retraining while significantly reducing wall-clock latency compared to synchronous baselines. AFU-IC thus enables efficient, compliant, and reliable FL in cross-silo medical environments.
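The two ideas in the abstract — erasure decoupled from the training loop, plus a server-side calibration that keeps later updates from reintroducing the erased influence — can be illustrated with a toy linear model. Everything below is a hedged sketch: the contribution bookkeeping, the subtraction-based erasure, and the projection-based calibration are illustrative assumptions standing in for the paper's actual algorithm, which the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical per-client update directions, standing in for real gradients.
clients = {cid: rng.normal(size=dim) for cid in range(4)}
contrib = {cid: np.zeros(dim) for cid in clients}   # accumulated contributions
model = np.zeros(dim)

def train_round(active, lr=0.1):
    """One FL round: apply each active client's update and log it."""
    global model
    for cid in active:
        step = lr * clients[cid]
        model = model + step
        contrib[cid] = contrib[cid] + step  # bookkeeping for later erasure

def unlearn(cid):
    """Toy stand-in for asynchronous erasure: subtract the target client's
    accumulated contribution; no other client has to pause."""
    global model
    model = model - contrib[cid]
    contrib[cid] = np.zeros(dim)

def calibrate(update, forget_dir):
    """Toy stand-in for server-side invariance calibration: project out the
    component of an incoming update along the erased client's direction, so
    continued training cannot silently relearn it."""
    u = forget_dir / np.linalg.norm(forget_dir)
    return update - (update @ u) * u

# Two normal rounds with all four clients.
train_round([0, 1, 2, 3])
train_round([0, 1, 2, 3])

# Client 3 asks to be forgotten; the remaining clients keep training.
unlearn(3)
model_after_unlearn = model.copy()
u = clients[3] / np.linalg.norm(clients[3])
proj_before = model @ u
for cid in [0, 1, 2]:
    model = model + calibrate(0.1 * clients[cid], clients[3])
proj_after = model @ u
```

In this simulation, erasure leaves the model equal to what the remaining clients alone produced, and the calibrated follow-up round leaves the model's component along the forgotten client's direction unchanged — a minimal analogue of "erased influence does not resurface".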
Source: arXiv: 2604.26809