Multi-Sensor Data and Image Fusion for Environmental Parameter Prediction with Uncertainty-Aware Deep Learning

Authors

  • Guosen Ma

DOI:

https://doi.org/10.54691/pc86v762

Keywords:

Multi-sensor fusion, environmental monitoring, uncertainty quantification, adaptive gated fusion, conformal prediction, deep learning.

Abstract

Accurate environmental parameter prediction is essential for air quality management, precision agriculture, water resource protection, and urban climate adaptation. Traditional approaches rely on single-modality sensor data and point estimation, neglecting the complementary information available in remote sensing imagery and failing to quantify prediction uncertainty. This paper proposes MSAF-UQ, a Multi-Sensor Adaptive Fusion framework with Uncertainty Quantification for environmental parameter prediction. Three key innovations are introduced: (1) a Multi-Sensor Adaptive Fusion (MSAF) mechanism that employs self-attention alignment, cross-attention interaction, and adaptive gating to dynamically fuse heterogeneous tabular sensor data with satellite or UAV imagery, overcoming the bottleneck of fixed-weight concatenation; (2) an Environment-Aware Uncertainty Decomposition (EAUD) framework that unifies Monte Carlo Dropout, deep ensembles, and conformal prediction to decompose total predictive uncertainty into epistemic, aleatoric, and distributional components, with a theoretically derived lower bound on joint coverage probability; and (3) a Sensor-Weighted Adaptive Loss (SWAL) that dynamically adjusts gradient contributions according to sample-level uncertainty and sensor reliability, strengthening learning on high-uncertainty samples. Systematic experiments on four cross-domain environmental benchmarks (Air Quality, Soil Moisture, Water Quality, and Urban Microclimate) show that MSAF-UQ reduces RMSE by 23.1% and MAE by 25.8%, attains a prediction interval coverage probability (PICP) of 96.2% with a 23.5% reduction in mean prediction interval width (MPIW), and exhibits strong cross-domain transferability, with all gains statistically significant (Wilcoxon rank-sum test, p < 0.01).
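
The full MSAF implementation is not shown on this page, but the fusion pattern the abstract names (self-attention alignment, cross-attention interaction, adaptive gating) can be sketched compactly. The PyTorch sketch below is illustrative only: the class name, dimensions, and gating form are assumptions, not the paper's code.

```python
import torch
import torch.nn as nn

class GatedCrossModalFusion(nn.Module):
    """Illustrative sketch of MSAF-style gated fusion of tabular sensor
    features with image features (shapes and modules are assumptions)."""

    def __init__(self, d_model=128, n_heads=4):
        super().__init__()
        # Self-attention aligns tokens within the sensor modality.
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Cross-attention lets sensor tokens attend to image tokens.
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Adaptive gate: a learned per-feature blend weight in (0, 1),
        # replacing fixed-weight concatenation.
        self.gate = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.Sigmoid())

    def forward(self, sensor_tokens, image_tokens):
        # Both inputs: (batch, seq_len, d_model)
        s, _ = self.self_attn(sensor_tokens, sensor_tokens, sensor_tokens)
        cross, _ = self.cross_attn(s, image_tokens, image_tokens)
        g = self.gate(torch.cat([s, cross], dim=-1))
        return g * s + (1.0 - g) * cross  # fused representation
```

Because the gate is input-dependent, the network can lean on imagery when in-situ sensors are noisy and on sensors when imagery is uninformative, which is what "adaptive" buys over a fixed concatenation weight.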
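The epistemic/aleatoric split claimed for EAUD matches the standard law-of-total-variance decomposition used with MC Dropout and deep ensembles; the paper's exact formulation, including the distributional term, is not reproduced here. As a reference point, with \(\theta\) ranging over stochastic forward passes and ensemble members and a heteroscedastic head predicting \(\mu_\theta(x)\) and \(\sigma_\theta^2(x)\):

\[
\operatorname{Var}(y \mid x) \;=\; \underbrace{\mathbb{E}_{\theta}\!\left[\sigma_{\theta}^{2}(x)\right]}_{\text{aleatoric}} \;+\; \underbrace{\operatorname{Var}_{\theta}\!\left(\mu_{\theta}(x)\right)}_{\text{epistemic}}.
\]

A union bound then yields a simple joint coverage lower bound of the kind the abstract mentions: if \(k\) interval procedures each cover with probability at least \(1-\alpha_i\), the probability that all cover simultaneously is at least \(1-\sum_{i=1}^{k}\alpha_i\).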
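In code, the three estimators combine naturally: stochastic forward passes over an ensemble supply the statistics above, and a split conformal step calibrates the final interval against distribution shift. A minimal sketch, assuming each model returns a (mean, log-variance) pair (all names and signatures are illustrative):

```python
import numpy as np
import torch

@torch.no_grad()
def predict_with_uncertainty(models, x, mc_samples=30):
    """Ensemble + MC Dropout sketch; assumes model(x) -> (mu, log_var)."""
    means, variances = [], []
    for model in models:              # deep ensemble members
        # .train() keeps dropout stochastic (MC Dropout); a real
        # implementation would enable only the dropout layers.
        model.train()
        for _ in range(mc_samples):
            mu, log_var = model(x)
            means.append(mu)
            variances.append(log_var.exp())
    means, variances = torch.stack(means), torch.stack(variances)
    aleatoric = variances.mean(0)     # average predicted noise variance
    epistemic = means.var(0)          # spread across passes and members
    return means.mean(0), aleatoric, epistemic

def conformal_interval(calibration_residuals, y_pred, alpha=0.1):
    """Split conformal prediction with the finite-sample quantile
    correction, giving marginal coverage of at least 1 - alpha."""
    n = len(calibration_residuals)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(np.abs(calibration_residuals), level, method="higher")
    return y_pred - q, y_pred + q
```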
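The abstract states SWAL's intent (larger gradient contributions for high-uncertainty samples, scaled by sensor reliability) but not its formula. One plausible reading, offered purely as an assumption:

```python
import torch

def sensor_weighted_adaptive_loss(y_pred, y_true, sample_uncertainty,
                                  sensor_reliability, eps=1e-8):
    """Hypothetical SWAL-style reweighting: up-weight uncertain samples,
    scale by per-sample sensor reliability (not the paper's exact form)."""
    per_sample = (y_pred - y_true) ** 2
    w = sample_uncertainty * sensor_reliability
    w = w / (w.mean() + eps)  # normalise to keep the gradient scale stable
    return (w.detach() * per_sample).mean()
```

Detaching the weights prevents the reweighting from back-propagating into the uncertainty estimates themselves; whether the paper's SWAL does this cannot be determined from the abstract.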

Published

2026-03-31

Issue

Vol. 8 No. 3 (2026)

Section

Articles

How to Cite

Ma, Guosen. 2026. “Multi-Sensor Data and Image Fusion for Environmental Parameter Prediction With Uncertainty-Aware Deep Learning”. Scientific Journal of Intelligent Systems Research 8 (3): 1-14. https://doi.org/10.54691/pc86v762.