Multimodal MRI Report Findings Supervised Brain Lesion Segmentation with Substructures
1️⃣ One-sentence summary
This paper proposes a new method, MS-RSuper, that uses the incomplete and uncertain qualitative descriptions (e.g., "mild," "possible") and quantitative cues (e.g., largest lesion size) found in radiology reports to guide lesion segmentation in multi-parametric brain tumor MRI, without relying on large amounts of precise voxel-level annotation, and validates its superiority on a large-scale dataset.
Report-supervised (RSuper) learning seeks to alleviate the need for dense tumor voxel labels by using constraints derived from radiology reports (e.g., volumes, counts, sizes, locations). Brain tumor MRI studies, however, typically involve multi-parametric scans and tumor substructures: fine-grained modality-wise findings are usually reported alongside global findings, and each correlates with a different substructure. Moreover, reports often describe only the largest lesion and provide qualitative or uncertain cues ("mild," "possible"). Under such incompleteness, classical RSuper losses (e.g., sum-volume consistency) can over-constrain the model or hallucinate unreported findings, and they cannot exploit these hierarchical findings or the priors of the varied lesion types in a merged dataset. We explicitly parse the global quantitative and modality-wise qualitative findings and introduce a unified, one-sided, uncertainty-aware formulation (MS-RSuper) that: (i) aligns modality-specific qualitative cues (e.g., T1c enhancement, FLAIR edema) with their corresponding substructures via existence and absence losses; (ii) enforces one-sided lower bounds for partial quantitative cues (e.g., largest-lesion size, minimal multiplicity); and (iii) adds extra- vs. intra-axial anatomical priors to respect cohort differences. Certainty tokens scale the penalties, and missing cues are down-weighted. On 1238 report-labeled BraTS-MET/MEN scans, MS-RSuper substantially outperforms both a sparsely-supervised baseline and a naive RSuper method.
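The paper itself does not include code, but a minimal PyTorch sketch may help make the one-sided, certainty-weighted constraints in (i) and (ii) concrete. The exact loss forms here (a log-barrier existence/absence term and a hinge-style volume lower bound), the function names, and the tensor shapes are illustrative assumptions, not the authors' implementation.

```python
import torch

def existence_absence_loss(pred_prob, reported_present, certainty):
    """Qualitative cue (i): if a substructure is reported present on its
    modality (e.g., FLAIR edema), at least one voxel should activate; if
    it is reported absent, the maximum activation should stay near zero.
    pred_prob:        (B, D, H, W) soft probabilities for one substructure
    reported_present: (B,) 0/1 flags parsed from the reports
    certainty:        (B,) in [0, 1]; "possible" < definite, missing cue = 0
    """
    max_prob = pred_prob.flatten(1).max(dim=1).values           # (B,)
    loss = torch.where(reported_present.bool(),
                       -torch.log(max_prob + 1e-6),             # existence
                       -torch.log(1.0 - max_prob + 1e-6))       # absence
    return (certainty * loss).mean()

def one_sided_volume_loss(pred_prob, reported_min_volume, certainty):
    """Quantitative cue (ii): the soft predicted volume should be at least
    the size parsed from the report. The penalty is one-sided because the
    report may understate the true extent (only the largest lesion is
    described), so exceeding the bound is never penalized.
    reported_min_volume: (B,) lower-bound voxel counts from the reports
    """
    soft_volume = pred_prob.flatten(1).sum(dim=1)               # (B,)
    deficit = torch.relu(reported_min_volume - soft_volume)     # hinge
    return (certainty * deficit).mean()
```

Scaling each term by a per-sample certainty weight is what lets uncertain cues ("possible") contribute softly and missing cues drop out entirely, matching the paper's described down-weighting behavior.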
Source: arXiv:2602.20994