Sen2Agri final version was just published

The Sen2Agri system was just released in version 2.0. As the project is now terminated and will not receive further funding from ESA, V2_0 is the final version, apart maybe from a few bug fixes.

 

Sen2Agri is a very complete ground segment that automates the download and processing of Sentinel-2 data to obtain multi-temporal syntheses, crop/non-crop and crop type masks, as well as biophysical variables and phenological indicators. It was designed to be able to process Sentinel-2 data over whole countries.

 

The Sen2Agri system has been a huge success. I receive an email each time MACCS/MAJA is downloaded, and I just counted 651 downloads after removing the duplicate downloads.

The new version includes the most recent version of MAJA (3.3) for atmospheric correction and cloud detection (but without using the CAMS data, which are not always easy to access), an early version of WASP to make the composites, an early version of Iota2 to obtain crop types, a processor to obtain the crop/non-crop mask, and a processor to compute biophysical variables inspired by the work of F. Baret and M. Weiss at INRA.

 

The consortium, funded by ESA, was led by Sophie Bontemps and Pierre Defourny at the Université Catholique de Louvain. The system was developed by CS-SI France and Romania, and most of the methods were designed at CESBIO.

Snow cover duration in the Canadian Rockies from Sentinel-2 observations

Recently I generated one year of snow maps from Sentinel-2 in the Canadian Rockies for a talented colleague who is working on the numerical simulation of the snow cover at high-resolution with an exciting new hydrological model. The area is not covered by Theia, hence I used Start_Maja to generate the L2A products and then LIS to generate the snow masks on the CNES supercomputer (thanks!).

Study area (four Sentinel-2 tiles)

This area is quite challenging for optical remote sensing of snow: the terrain is steep and there are a lot of forests. After processing this area, I also found some unexpected issues in the LIS processor, which need to be fixed, like turbid rivers or wildfire smoke being detected as snow. I tried to use Pekel's water mask to remove the river and lake pixels, but some glaciers are misclassified as water in this product, hence I simply masked out areas below 2000 m. This eliminates most of the water surfaces but not the forests, hence I used Hansen's global forest product to mask out pixels with a tree cover density larger than 50%.
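For illustration, a minimal sketch of this post-processing (elevation threshold at 2000 m and tree cover threshold at 50%, as described above) could look like the following; the file names, band layout and the use of rasterio are assumptions for the example, not the actual LIS workflow.

```python
import numpy as np
import rasterio

# Hypothetical input rasters, all resampled to the same grid:
# snow.tif      : LIS snow mask (1 = snow, 0 = no snow)
# dem.tif       : elevation in metres
# treecover.tif : Hansen tree cover density in percent
with rasterio.open("snow.tif") as src:
    snow = src.read(1)
    profile = src.profile
with rasterio.open("dem.tif") as src:
    dem = src.read(1)
with rasterio.open("treecover.tif") as src:
    tree = src.read(1)

# Discard pixels below 2000 m (removes most turbid rivers and lakes)
# and pixels with a tree cover density above 50%.
valid = (dem >= 2000) & (tree <= 50)
snow_clean = np.where(valid, snow, 0).astype(snow.dtype)

with rasterio.open("snow_clean.tif", "w", **profile) as dst:
    dst.write(snow_clean, 1)
```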

Peter Lougheed Provincial Park



Near real time detection of deforestation in French Guiana

In October 2018, Marie Ballère started a Ph.D. funded by WWF and CNES. The aim of her Ph.D. is to characterize animal habitats in tropical forests using radar and optical data. The first results on near real time forest disturbance assessment using Sentinel-1 radar data in French Guiana were shown at the ESA Living Planet Symposium 2019 in Milan, and they are striking!

 

The near real time forest disturbance detection method used by Marie is described in Bouvet et al. (2018) and was successfully tested over a test site in Peru. Classical methods are based on the hypothesis that the radar backscatter decreases when disturbances occur. However, the backscatter does not necessarily decrease: rainfall or trees remaining on the ground, for example, can lead to an increase of the radar backscatter.

 

To get around this problem, the method of Bouvet et al. (2018) is based on the detection of radar shadowing. Shadowing occurs in radar images because of the side-looking viewing geometry of radar systems: a shadow in a radar image is an area that cannot be reached by any radar pulse. Shadows created by trees at the borders between forest and non-forest areas can be observed in high-resolution radar images (Figure 1), depending on the viewing direction. Newly appearing shadows are characterized by a sudden drop of backscatter in the radar time series. Thanks to the purely geometrical nature of the shadowing effect, this decrease of backscatter is expected to persist over time. New shadows should consequently remain visible for a long time, and are easily detectable when dense time series of radar data, such as Sentinel-1 time series, are available.
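As a rough illustration of this idea (not the actual implementation of Bouvet et al., 2018), a persistent shadow-like drop can be flagged by comparing the mean backscatter before and after each date in a dense time series; the window length, threshold and function name below are arbitrary assumptions.

```python
import numpy as np

def detect_new_shadow(sigma0_db, window=5, drop_db=3.0):
    """Flag dates where the mean backscatter (in dB) drops by more than
    `drop_db` between the `window` acquisitions before and after the date,
    i.e. a sudden drop that persists over the following acquisitions."""
    sigma0_db = np.asarray(sigma0_db, dtype=float)
    alerts = []
    for t in range(window, len(sigma0_db) - window):
        before = sigma0_db[t - window:t].mean()
        after = sigma0_db[t:t + window].mean()
        if before - after > drop_db:
            alerts.append(t)
    return alerts

# Toy example: a pixel time series with a sudden, persistent ~5 dB drop.
series = [-7, -8, -7, -7, -8, -13, -12, -13, -12, -13, -12]
print(detect_new_shadow(series, window=3))  # dates around the drop
```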

Figure 1 Illustration of the SAR shadowing effect at the border between forests and deforested areas

 

This method has been tested over various sites in South American, African and Asian tropical forests for three years now, and significantly improved. Marie Ballère participated in the improvement of the method, applied it over the whole of French Guiana using Sentinel-1 data acquired from 2016 to 2018, and validated the resulting maps. The detection of slashing deforestation (a farming method that involves cutting and burning trees) has been validated using 94 reference plots (surface area of 48.2 ha) kindly shared by Pierre Joubert and Eloise Grebic from the Parc Amazonien de Guyane. Producer and user accuracies for disturbed forests reached 83% and 99% respectively. Gold mining detection has been validated using 36 reference plots (surface area of 76 ha), leading to producer and user accuracies of 86% and 99% respectively.
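As a reminder of how these scores are defined (standard definitions, not specific to this study), producer and user accuracies can be computed from the confusion counts of the reference plots; the counts below are hypothetical, chosen only to reproduce figures of the same order as those quoted above.

```python
def producer_accuracy(true_pos, false_neg):
    # Fraction of reference (ground truth) disturbed plots that were detected.
    return true_pos / (true_pos + false_neg)

def user_accuracy(true_pos, false_pos):
    # Fraction of detected disturbances that correspond to real disturbances.
    return true_pos / (true_pos + false_pos)

# Hypothetical counts, for illustration only (not the study's actual confusion matrix):
print(producer_accuracy(true_pos=78, false_neg=16))  # ~0.83
print(user_accuracy(true_pos=78, false_pos=1))       # ~0.99
```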

 

In addition, we compared our results with the deforestation patches detected in the University of Maryland (UMD) Global Land Analysis and Discovery (GLAD) Forest Alert dataset (Hansen et al., 2016), a Landsat-based humid tropical forest disturbance alert system over the tropics (http://glad.geog.umd.edu/alerts). Producer accuracies of 24% and 44% were found for slashing deforestation and gold mining respectively. A small area showing the comparison between the CESBIO and UMD-GLAD methods is shown in the maps in Figure 2.

Figure 2 Comparison between the CESBIO and UMD-GLAD near real time forest disturbance detection methods in French Guiana in 2018. Gold mining reference plots are shown in green. Producer accuracies of 86% and 44% were found using the CESBIO and UMD-GLAD methods, respectively

Figure 3 shows the number of disturbed areas detected per month using the CESBIO and UMD-GLAD methods (disturbed plots that were not detected by the UMD-GLAD method were not taken into account). Slashing deforestation, which occurs mainly during the dry season, was detected in a timely manner by both methods. However, because clouds hamper the GLAD optical-based forest disturbance detection during the rainy season, gold mining, which occurs all year long, was detected 72±58 days earlier using the CESBIO method.

Figure 3 Slashing deforestation, occurring mainly during the dry season, was detected in a timely manner by both the CESBIO and UMD-GLAD methods. Gold mining, occurring all year long, was detected 72±58 days earlier using the CESBIO method

Figure 4 Forest disturbances detection using the CESBIO method in French Guiana from 2016 to 2018

The CESBIO method has been applied over the whole of French Guiana for the years 2016, 2017 and 2018 (Figure 4). The deforestation rates were found to be -0.7%, -0.5% and -0.5% respectively, relative to the area of French Guiana.

 

A lot of exciting research can now be performed based on these results (e.g. understanding the causes of the spatial and temporal evolution of disturbance patterns). In addition, Sentinel-1 and Sentinel-2 data are currently being used by Marie to identify the drivers of deforestation.

References:
  • Bouvet, A., Mermoz, S., Ballère, M., Koleck, T., & Le Toan, T. (2018). Use of the SAR Shadowing Effect for Deforestation Detection with Sentinel-1 Time Series. Remote Sensing, 10(8), 1250.
  • Hansen, M. C., Krylov, A., Tyukavina, A., Potapov, P. V., Turubanova, S., Zutta, B., ... & Moore, R. (2016). Humid tropical forest disturbance alerts using Landsat data. Environmental Research Letters, 11(3), 034008.

Combined exploitation of VENμS, Sentinel-2 and Landsat-8: the spectral bands


The combined use of VENμS, Sentinel-2 and Landsat-8 data can increase the likelihood of obtaining cloud-free images or may allow detailed tracking of rapidly evolving phenomena.
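As a back-of-the-envelope illustration of the first point (the 40% clear-sky probability per acquisition and the independence between acquisitions are assumptions, not measured values), combining views from the three missions within a short time window quickly raises the chance of at least one cloud-free observation:

```python
# Probability of at least one cloud-free view in a short time window,
# assuming independent acquisitions, each cloud-free with probability p.
p = 0.4  # assumed clear-sky probability per acquisition
for n_acquisitions in (1, 2, 3):  # e.g. one view per mission in the window
    p_clear = 1 - (1 - p) ** n_acquisitions
    print(n_acquisitions, round(p_clear, 2))
# 1 acquisition: 0.4, 2 acquisitions: 0.64, 3 acquisitions: 0.78
```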

In order to facilitate this combination, the table below summarizes the correspondences between the spectral bands of the instruments. VENμS does not have a spectral band in the middle infrared.

The figure below shows the spectral bands of VENμS and Sentinel-2 in the 400 to 1000 nm range. The SWIR bands of Sentinel-2 are not included. The table below shows the usual band combinations.


The figure below makes it possible to assess the degree of similarity of the spectral responses of these usual bands.

The detailed spectral responses of each instrument are available via the following web pages:

VENµS

http://www.cesbio.ups-tlse.fr/multitemp/?page_id=14229

SENTINEL-2

https://earth.esa.int/web/sentinel/user-guides/sentinel-2-msi/document-library/-/asset_publisher/Wk0TKajiISaR/content/sentinel-2a-spectral-responses

LANDSAT

https://landsat.usgs.gov/spectral-characteristics-viewer

https://landsat.usgs.gov/landsat/spectral_viewer/bands/Ball_BA_RSR.xlsx


MAJA 3.3 is available, with a LOT of improvements

What's new?

Phew! It has taken quite a long time, but MAJA 3.3 is available, and it improves a LOT of things!

  • Some bugs have been fixed, like the one which caused false detections of clouds or cloud shadows at the edges of the images.
  • It seems we have finally solved the bugs that had plagued the CAMS option since we released MAJA V3.0. Since V3.0, this option uses the Copernicus Atmosphere aerosol forecasts to set the aerosol type before retrieving the aerosol optical thickness (AOT) from Sentinel-2 data.
  • We now also use the CAMS AOT as a default value when it is not possible to estimate the AOT from the images, for instance above a snow-covered landscape or for small gaps in a large cloud cover. Before that, we used 0.1 everywhere as the default value. The default value is used in the cost function with a very low weight: it has no impact when conditions for AOT estimation are good, but a large impact in bad conditions.
  • The cirrus correction module was over-correcting the impact of thick cirrus clouds, providing images with dark clouds. We have limited the correction in order to get more realistic values.
  • We have improved the cloud detection, with a better compromise between false positives and false negatives. We also better handle the variation with altitude of the cirrus cloud detection with band 10 (1.38 µm). MAJA 3.3 is the version with which we obtained the results of our recently published article. This paper shows that MAJA has slightly better performances than FMask 4.0, and much better performances than Sen2Cor.

Moreover, when we validated the results, we found out that one of the parameters in our settings had the wrong value (10 instead of 1). Such errors are easy to make, because there are about 150 parameters in MAJA. We have had version management of the MAJA settings since 2017, but the erroneous value was already there before that. And this value has a big impact! The standard deviation of errors in AOT estimates is reduced by 30 to 40%!

The W_dark parameter controls the weight of the dark pixel method in the AOT estimation. This method is only supposed to be used as a safeguard in case the multi-temporal or multi-spectral methods provide wrong results. It should therefore have a low weight, but with a weight of 10, it was in fact the method with the highest weight in our estimates. As this method provides a maximum value of the AOT, it tended to reduce the estimates and provide AOTs that were too low. This improvement is therefore a great piece of news, but it comes with some shame at not having found this error before.
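To make the role of these weights concrete, here is a toy sketch of a weighted AOT cost function; the terms, values and use of scipy are purely illustrative and do not reproduce MAJA's actual cost function, only the idea that an overly large W_dark lets the dark-pixel ceiling dominate the estimate.

```python
from scipy.optimize import minimize_scalar

def aot_cost(aot, aot_mt, aot_dark_max, aot_cams,
             w_mt=1.0, w_dark=1.0, w_default=0.1):
    """Toy weighted cost: a multi-temporal term, a dark-pixel term that only
    penalises AOT above the dark-pixel maximum, and a low-weight default/CAMS
    term used when the other criteria are weak."""
    cost_mt = w_mt * (aot - aot_mt) ** 2
    cost_dark = w_dark * max(0.0, aot - aot_dark_max) ** 2
    cost_default = w_default * (aot - aot_cams) ** 2
    return cost_mt + cost_dark + cost_default

# With w_dark=10 the dark-pixel ceiling dominates and drags the estimate down;
# with w_dark=1 it only acts as a safeguard.
for w_dark in (10.0, 1.0):
    res = minimize_scalar(
        lambda a: aot_cost(a, aot_mt=0.25, aot_dark_max=0.15,
                           aot_cams=0.10, w_dark=w_dark),
        bounds=(0.0, 1.0), method="bounded")
    print(w_dark, round(res.x, 3))
```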

 

AOT validation against AERONET for 10 sites: left, with the wrong W_dark value; right, with the correct W_dark value.

The blue dots correspond to validation obtained in good conditions, while red dots correspond to less reliable validation points.

These are the results for version 3.3 with CAMS activated, changing only the W_dark parameter, but if we compare with the results of version 3.1, the improvement is even more impressive:

AOT validation against AERONET for 10 sites: left, version 3.1; right, version 3.3 with the correct W_dark value.

How to access MAJA?

Here is how to access MAJA 3.3:

  • MAJA 3.3 is distributed as free software for non-commercial purposes from the CNES free software site (select the 3.3 TM version in the download tab). If you need it for commercial purposes, you just have to ask me for a different licence, but it will still be free of charge.
  • The best way to use MAJA is to run it with Start_MAJA, which is a simple Python code that runs MAJA for a whole time series on a given Sentinel-2 tile. The Start_MAJA readme also explains how to get the correct settings, with the correct W_dark value, how to prepare the DEM, and how to get the CAMS data.
  • The PEPS on-demand processing facility will be updated soon, but it is still running MAJA 3.2 for now.
  • THEIA is also running MAJA 3.2. We will first update the W_dark parameter, then move to MAJA 3.3, and then start production with CAMS, hopefully before summer. If everything goes well, we will then start a reprocessing of our whole data set. So, stay tuned on this information channel.