Data Fusion of Sentinel and LiDAR for Soil Texture Prediction
Abstract
Soil texture is a critical parameter influencing agricultural productivity, water retention, and nutrient availability. Accurate mapping of soil texture enhances precision agriculture and sustainable land management. This study explores the integration of Sentinel-1 (S-1) synthetic aperture radar (SAR), Sentinel-2 (S-2) multispectral imagery, and Light Detection and Ranging (LiDAR) data to predict soil texture across a 3000 km² semi-arid agricultural region in central Tunisia. Using machine learning algorithms, specifically Random Forest (RF) and Support Vector Machine (SVM), we fused multi-source remote sensing data to estimate clay, silt, and sand fractions. Field samples (n=150) with clay content ranging from 13% to 60% were used for training and validation. Results indicate that the fused dataset achieved a classification accuracy of 85% (RF) and 82% (SVM), with a root mean square error (RMSE) of 5.2% for clay content prediction. The integration of LiDAR-derived topographic features with Sentinel data significantly improved prediction accuracy compared to single-sensor approaches. This study highlights the potential of data fusion for high-resolution soil texture mapping, offering valuable insights for precision agriculture.
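The workflow summarized above, stacking Sentinel-1/Sentinel-2 features with LiDAR-derived terrain attributes and regressing clay content with a Random Forest, can be sketched as follows. This is a minimal illustration on synthetic data, not the study's actual dataset or code: the feature layout, hyperparameters, and the generated clay values are placeholder assumptions; only the sample count (n=150) and the 13–60% clay range echo figures reported in the abstract.

```python
# Hedged sketch: Random Forest regression on fused remote-sensing features,
# loosely mirroring the abstract's workflow. All data here are synthetic
# placeholders, not the study's field samples.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)

n_samples = 150  # matches the field-sample count reported in the abstract
# Hypothetical stacked predictors: S-1 backscatter (VV, VH), S-2 bands,
# and LiDAR-derived topographic attributes (elevation, slope, TWI).
X = rng.normal(size=(n_samples, 10))
# Synthetic clay fraction (%) confined to the 13-60% range noted above,
# with a weak dependence on the first feature so the model has signal.
clay = np.clip(13 + 47 * rng.random(n_samples) + 2.0 * X[:, 0], 13, 60)

X_train, X_test, y_train, y_test = train_test_split(
    X, clay, test_size=0.3, random_state=0
)

rf = RandomForestRegressor(n_estimators=500, random_state=0)
rf.fit(X_train, y_train)

# RMSE on the held-out split (square root of mean squared error).
rmse = mean_squared_error(y_test, rf.predict(X_test)) ** 0.5
print(f"RMSE (clay %): {rmse:.2f}")
```

In practice the predictor matrix would be built by co-registering and resampling the S-1, S-2, and LiDAR rasters to a common grid before extracting values at the sampling locations; the SVM variant would substitute `sklearn.svm.SVR` with the same train/test split.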
How to Cite This Article
Dr. Anastasios Georgiou, Dr. Seyed Mahdi Hosseini (2024). Data Fusion of Sentinel and LiDAR for Soil Texture Prediction. Journal of Soil Future Research (JSFR), 5(2), 19-22.