Earth observation satellites are used for many tasks, among them the monitoring of agricultural areas. For example, subsistence farming can be monitored to predict food shortages, which in turn helps to organize humanitarian aid faster when a shortage occurs. Optical data is often used because of its easy interpretability; in particular, the normalized difference vegetation index (NDVI) is frequently used for vegetation monitoring. However, in tropical areas with frequent cloud coverage, or in subtropical areas where the main growing season coincides with the rainy season, clouds hinder the acquisition of optical images. To circumvent this, active cloud-penetrating sensors such as synthetic aperture radar (SAR) can be used. However, the greatly different characteristics of SAR images make interpretation more difficult, and usable intelligence is harder to obtain.
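For reference, the NDVI is computed from the near-infrared and red reflectances as (NIR − Red) / (NIR + Red); the sketch below shows this standard formula (the `ndvi` helper and its zero-denominator handling are illustrative, not part of the presented model):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero (e.g. fully dark pixels).
    return np.where(denom == 0, 0.0,
                    (nir - red) / np.where(denom == 0, 1.0, denom))

# Dense vegetation reflects strongly in NIR, yielding an NDVI close to 1.
print(ndvi(np.array([0.45]), np.array([0.05])))  # → [0.8]
```

For Sentinel-2, NIR and Red correspond to bands B8 and B4, respectively.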
There is great demand to mitigate this problem by converting SAR backscatter values to artificial NDVI values, which in turn can be used for downstream tasks. This idea has already been demonstrated in two studies [1, 2]. However, these studies are limited to small areas and present conversion models that are not globally applicable. Additionally, they suffer from low performance when relying only on backscatter values without additional data sources such as the last cloud-free NDVI value.
As a solution, we present a globally applicable model for the conversion of SAR backscatter values to NDVI values using a deep neural network. The model does not rely on optical data at application time and is therefore unaffected by cloud cover.
To train the model, a dataset consisting of Sentinel-1 SAR data and Sentinel-2 optical data is created. To ensure a direct relation between backscatter and NDVI values, the temporal distance between images of the same area is at most 12 hours. This avoids other influences such as seasonal changes or vegetation growth. Images were sampled globally, equally distributed over climate zones and land covers, to capture the full spectrum of Earth surfaces and vegetation. As auxiliary data, the 10 m resolution ESA WorldCover product [3] and the 30 m resolution ALOS JAXA DEM [4] were retrieved. Google Earth Engine was used to download the data.
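The pairing constraint described above can be sketched as follows; the `pair_scenes` helper and the `(tile_id, acquisition_datetime)` catalog format are hypothetical, standing in for whatever scene metadata Google Earth Engine returns:

```python
from datetime import datetime, timedelta

MAX_GAP = timedelta(hours=12)  # maximum temporal distance between S1 and S2

def pair_scenes(s1_scenes, s2_scenes):
    """Pair each Sentinel-1 scene with Sentinel-2 scenes of the same tile
    acquired at most 12 hours apart. Scenes are hypothetical
    (tile_id, acquisition_datetime) tuples."""
    pairs = []
    for tile1, t1 in s1_scenes:
        for tile2, t2 in s2_scenes:
            if tile1 == tile2 and abs(t1 - t2) <= MAX_GAP:
                pairs.append((tile1, t1, t2))
    return pairs

# Only the Sentinel-2 scene 5 hours after the SAR acquisition qualifies;
# the one two days later is rejected.
pairs = pair_scenes(
    [("T32UNU", datetime(2020, 6, 1, 5, 30))],
    [("T32UNU", datetime(2020, 6, 1, 10, 30)),
     ("T32UNU", datetime(2020, 6, 3, 10, 30))])
print(len(pairs))  # → 1
```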
The model is a slightly adapted U-Net. It performs a pixel-wise regression of the NDVI using the VV and VH polarizations of the Sentinel-1 data, the ESA WorldCover map, and the ALOS DEM.
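A minimal sketch of how the four input layers could be stacked into a multi-channel network input is shown below; the normalization ranges are illustrative assumptions, not the values used in this work:

```python
import numpy as np

def build_input(vv, vh, worldcover, dem):
    """Stack VV, VH, WorldCover, and DEM into a (4, H, W) array.
    All normalization constants are illustrative assumptions."""
    vv_n = (np.clip(vv, -25.0, 0.0) + 25.0) / 25.0    # backscatter in dB
    vh_n = (np.clip(vh, -30.0, -5.0) + 30.0) / 25.0   # backscatter in dB
    wc_n = worldcover / 100.0                         # class codes 10..100
    dem_n = dem / 9000.0                              # elevation in metres
    return np.stack([vv_n, vh_n, wc_n, dem_n], axis=0)

x = build_input(np.full((64, 64), -10.0), np.full((64, 64), -17.0),
                np.full((64, 64), 40), np.full((64, 64), 500.0))
print(x.shape)  # → (4, 64, 64)
```

A standard U-Net with a single output channel and a regression loss (e.g. mean squared error against the Sentinel-2 NDVI) would then map this stack to a per-pixel NDVI prediction.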
Using this approach, a globally applicable model is created that predicts the NDVI from cloud-penetrating SAR images. This removes the need to train models for specific regions and vegetation types. One disadvantage of this approach is the lower resolution of Sentinel-1 images (with a pixel size of 20 m × 22 m [5]) compared to the 10 m × 10 m resolution of Sentinel-2 images, which prevents the correct prediction of some fine spatial details. Further research is needed to recover these details, either by using time series as input instead of single-date images or by including other data sources that provide more structural detail.
ACKNOWLEDGMENT
This work was supported by the German Federal Ministry for Economic Affairs and Energy in the project “DESTSAM - Dense Satellite Time Series for Agricultural Monitoring” (FKZ 50EE2018A).
REFERENCES
[1] G. Scarpa, M. Gargiulo, A. Mazza, and R. Gaetano, “A CNN-based fusion method for feature extraction from Sentinel data,” Remote Sensing, vol. 10, no. 2, Art. no. 236, 2018.
[2] R. Filgueiras, E. C. Mantovani, D. Althoff, E. I. F. Filho, and F. F. da Cunha, “Crop NDVI monitoring based on Sentinel 1,” Remote Sensing, vol. 11, no. 12, Art. no. 1441, 2019.
[3] D. Zanaga, R. Van De Kerchove, W. De Keersmaecker, N. Souverijns, C. Brockmann, R. Quast, J. Wevers, A. Grosu, A. Paccini, S. Vergnaud, O. Cartus, M. Santoro, S. Fritz, I. Georgieva, M. Lesiv, S. Carter, M. Herold, L. Li, N. E. Tsendbazar, F. Ramoino, and O. Arino, “ESA WorldCover 10 m 2020 v100,” 2021.
[4] J. Takaku, T. Tadono, M. Doutsu, F. Ohgushi, and H. Kai, “Updates of ‘AW3D30’ ALOS Global Digital Surface Model with Other Open Access Datasets,” The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, ISPRS, vol. XLIII-B4-2020, pp. 183–189, 2020.
[5] Collecte Localisation Satellites, “Sentinel-1 product definition,” Online, 2015.