arXiv submission date: 2026-04-20
📄 Abstract - Implicit neural representations as a coordinate-based framework for continuous environmental field reconstruction from sparse ecological observations

Reconstructing continuous environmental fields from sparse and irregular observations remains a central challenge in environmental modelling and biodiversity informatics. Many ecological datasets are heterogeneous in space and time, making grid-based approaches difficult to scale or generalise across domains. Here, we evaluate implicit neural representations (INRs) as a coordinate-based modelling framework for learning continuous spatial and spatio-temporal fields directly from coordinate inputs. We analyse their behaviour across three representative modelling scenarios: species distribution reconstruction, phenological dynamics, and morphological segmentation derived from open biodiversity data. Beyond predictive performance, we examine interpolation behaviour, spatial coherence, and computational characteristics relevant for environmental modelling workflows, including scalability, resolution-independent querying, and architectural inductive bias. Results show that neural fields provide stable continuous representations with predictable computational cost, complementing classical smoothers and tree-based approaches. These findings position coordinate-based neural fields as a flexible representation layer that can be integrated into environmental modelling pipelines and exploratory analysis frameworks for large, irregularly sampled datasets.
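The core idea of a coordinate-based representation, a model queried directly at arbitrary coordinates rather than on a fixed grid, can be illustrated with a minimal sketch. This is not the paper's architecture: random Fourier features with a ridge-regression fit stand in for a trained INR, and the synthetic field, frequency scale, and sample counts are all illustrative assumptions. It does, however, show the two properties the abstract highlights: fitting from sparse, irregular samples and resolution-independent querying.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an unknown 2-D environmental field (illustrative only)
def true_field(xy):
    return np.sin(3 * xy[:, 0]) * np.cos(2 * xy[:, 1])

# Sparse, irregularly placed observations
coords = rng.uniform(-1, 1, size=(200, 2))
values = true_field(coords)

# Random Fourier features: a fixed sinusoidal coordinate encoding,
# loosely analogous to the positional encoding used by neural fields
B = rng.normal(scale=3.0, size=(2, 64))  # random frequencies (assumed scale)

def features(xy):
    proj = xy @ B
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=1)

# Fit the linear readout in closed form (ridge regression), rather than SGD
Phi = features(coords)
w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(Phi.shape[1]),
                    Phi.T @ values)

# Resolution-independent querying: evaluate at any coordinates, no grid needed
query = rng.uniform(-1, 1, size=(500, 2))
pred = features(query) @ w
rmse = np.sqrt(np.mean((pred - true_field(query)) ** 2))
```

Because the representation is a function of continuous coordinates, the same fitted model can be sampled at any density, which is what makes this family of methods attractive for irregular ecological data.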

Top-level tags: machine learning, environmental science
Detailed tags: implicit neural representations, environmental field reconstruction, sparse observations, coordinate-based modeling, biodiversity

Implicit neural representations as a coordinate-based framework for continuous environmental field reconstruction from sparse ecological observations


1️⃣ One-sentence summary

This paper evaluates implicit neural networks (neural fields) that learn continuous spatial and spatio-temporal environmental fields directly from coordinates, handling sparse, irregular ecological data efficiently and showing stable predictive behaviour and good computational scalability across species distribution, phenological dynamics, and morphological segmentation tasks.

Source: arXiv:2604.18083