Abstract - Let Robots Feel Your Touch: Visuo-Tactile Cortical Alignment for Embodied Mirror Resonance
Observing touch on another's body can elicit corresponding tactile sensations in the observer, a phenomenon termed mirror touch that supports empathy and social perception. This visuo-tactile resonance is thought to rely on structural correspondence between visual and somatosensory cortices, yet robotic systems lack computational frameworks that instantiate this principle. Here we demonstrate that cortical correspondence can be operationalized to endow robots with mirror touch. We introduce Mirror Touch Net, which imposes semantic, distributional and geometric alignment between visual and tactile representations through multi-level constraints, enabling prediction of millimetre-scale tactile signals across 1,140 taxels on a robotic hand from RGB images. Manifold analysis reveals that these constraints reshape visual representations into geometry consistent with the tactile manifold, reducing the complexity of cross-modal mapping. Extending this alignment framework to cross-domain observations of human hands enables tactile prediction and reflexive responses to observed human touch. Our results link a neural principle of visuo-tactile resonance to robotic perception, providing an explainable route towards anticipatory touch and empathic human-robot interaction. Code is available at this https URL.
Let Robots Feel Your Touch: Visuo-Tactile Cortical Alignment for Embodied Mirror Resonance
1️⃣ One-Sentence Summary
Inspired by the human mirror-touch phenomenon, this work proposes an algorithm called Mirror Touch Net that aligns a robot's visual and tactile representations at the semantic, distributional, and geometric levels, enabling it to predict fine-grained tactile signals across 1,140 taxels on a robotic hand from RGB images alone, to predict tactile sensations from observed human touch, and to produce reflexive responses to it, offering an explainable route towards empathic human-robot interaction.
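The abstract does not specify the exact form of the multi-level constraints, but the three levels it names (semantic, distributional, geometric) each correspond to a standard family of alignment losses. The sketch below is an illustrative stand-in, not the paper's implementation: cosine distance for paired semantic alignment, a moment-matching term (a simple substitute for e.g. MMD) for distributional alignment, and a pairwise-distance term that pushes the visual embedding geometry towards the tactile manifold's geometry. All function names and weights are assumptions for illustration.

```python
import numpy as np

def semantic_loss(v, t):
    """Cosine distance between paired visual/tactile embeddings (rows)."""
    v_n = v / np.linalg.norm(v, axis=1, keepdims=True)
    t_n = t / np.linalg.norm(t, axis=1, keepdims=True)
    return float(np.mean(1.0 - np.sum(v_n * t_n, axis=1)))

def distributional_loss(v, t):
    """Match batch statistics (mean and covariance) of the two embedding sets;
    a cheap stand-in for a kernel-based divergence such as MMD."""
    mean_gap = np.sum((v.mean(axis=0) - t.mean(axis=0)) ** 2)
    cov_gap = np.sum((np.cov(v.T) - np.cov(t.T)) ** 2)
    return float(mean_gap + cov_gap)

def geometric_loss(v, t):
    """Penalize mismatch between the pairwise-distance matrices, so the
    visual embedding inherits the relative geometry of the tactile manifold."""
    dv = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=-1)
    dt = np.linalg.norm(t[:, None, :] - t[None, :, :], axis=-1)
    return float(np.mean((dv - dt) ** 2))

def alignment_loss(v, t, weights=(1.0, 0.1, 0.1)):
    """Weighted multi-level alignment objective (weights are hypothetical)."""
    return (weights[0] * semantic_loss(v, t)
            + weights[1] * distributional_loss(v, t)
            + weights[2] * geometric_loss(v, t))
```

By construction the total loss is zero when the visual and tactile embeddings coincide, and each term penalizes a different kind of misalignment, which is the intuition behind reshaping the visual representation to be consistent with the tactile manifold.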