8 Remote sensing
8.1 Introduction
Remote sensing techniques, in the sense of gathering and processing data by a device separated from the object under study, increasingly provide an important component of the set of technologies available for the study of vegetation systems and their functioning. This is despite the fact that many applications only provide indirect estimations of the biophysical variables of interest (Jones and Vaughan 2010).
Particular advantages of remote sensing for vegetation studies are that: (i) it is non-contact and non-destructive; and (ii) observations are easily extrapolated to larger scales. Even at the plant scale, remotely sensed imagery is advantageous as it allows rapid sampling of large numbers of plants (Jones and Vaughan 2010).
This chapter aims to provide a conceptual and practical approach to applying remote sensing data and techniques to infer information useful for monitoring crop diseases. The chapter is divided into four sections. The first introduces basic remote sensing concepts and provides a summary of applications of remote sensing of crop diseases. The second illustrates a case study focused on identification of banana Fusarium wilt from multispectral UAV imagery. The third illustrates a case study dealing with estimation of cercospora leaf spot disease on table beet. Finally, the chapter concludes with several reflections on the potential and limitations of this technology.
8.2 Remote sensing background
8.2.1 Optical remote sensing
Optical remote sensing makes use of the radiation reflected by a surface in the visible (VIS, ~400-700 nm), near-infrared (NIR, 700-1300 nm), and shortwave infrared (SWIR, 1300-~3000 nm) parts of the electromagnetic spectrum. Spaceborne and airborne remote sensing and field spectroscopy use solar radiation as the illumination source, whereas laboratory spectroscopy uses a lamp as an artificial illumination source (Figure 8.1).

The proportion of the radiation reflected by a surface depends on the surface’s spectral reflection, absorption, and transmission properties and varies with wavelength (Figure 8.2). These spectral properties in turn depend on the surface’s physical and chemical constituents (Figure 8.2). Measuring the reflected radiation hence allows us to draw conclusions about a surface’s characteristics, which is the basic principle behind optical remote sensing.

8.2.2 Vegetation spectral properties
Optical remote sensing enables the deduction of various vegetation-related characteristics, including biochemical properties (e.g., pigments, water content), structural properties (e.g., leaf area index (LAI), biomass) or process properties (e.g., light use efficiency (LUE)). The ability to deduce these characteristics depends on the ability of a sensor to resolve vegetation spectra. Hyperspectral sensors capture spectral information in hundreds of narrow and contiguous bands in the VIS, NIR and SWIR, and, thus, resolve subtle absorption features caused by specific vegetation constituents (e.g. anthocyanins, carotenoids, lignin, cellulose, proteins). In contrast, multispectral sensors capture spectral information in a few broad spectral bands and, thus, only resolve broader spectral features. Still, multispectral systems like Sentinel-2 have been demonstrated to be useful to derive valuable vegetation properties (e.g., LAI, chlorophyll).

8.2.3 What measures a remote sensor?
Optical sensors (spectrometers) measure the radiation reflected by a surface into a certain solid angle in the physical quantity radiance. The unit of radiance is watts per square meter per steradian (W·m⁻²·sr⁻¹) (Figure 8.4). In other words, radiance describes the amount of energy (W) that is reflected from a surface (m⁻²) and arrives at the sensor within a three-dimensional angle (sr⁻¹).

A general problem with using radiance as the unit of measurement is that radiance values vary with illumination. For example, the absolute incoming solar radiation varies over the course of the day as a function of the relative position between sun and surface, and so does the absolute amount of radiance measured. We can only compare measurements taken a few hours apart or on different dates by putting the measured radiance in relation to the incoming illumination.
The quotient of measured reflected radiance and measured incoming radiance (\(Radiance_{reflected} / Radiance_{incoming}\)) is called reflectance (usually denoted \(\rho\)). Reflectance provides a stable unit of measurement that is independent of illumination; it is the fraction of the total measurable radiation that has not been absorbed or transmitted.
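As a minimal sketch, with made-up radiance readings for a single wavelength (the numbers are for illustration only, not from any real sensor), the reflectance quotient translates directly into R:

```r
# Hypothetical radiance readings (W·m^-2·sr^-1) at one wavelength:
radiance_reflected <- 42.5  # radiance measured over the target surface
radiance_incoming  <- 85.0  # radiance measured over a reference panel

# Reflectance: the illumination-independent fraction reflected
reflectance <- radiance_reflected / radiance_incoming
reflectance  # 0.5, i.e., half of the incoming radiation is reflected
```

In practice, the incoming radiance is usually estimated from a calibrated white reference panel measured under the same illumination as the target.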
8.2.4 Hyperspectral vs. multispectral imagery
Hyperspectral imaging involves capturing and analyzing data from a large number of narrow, contiguous bands across the electromagnetic spectrum, resulting in a high-resolution spectrum for each pixel in the image. As a result, a hyperspectral camera provides smooth spectra. The spectra provided by multispectral cameras look more like stairs or saw teeth, without the ability to depict acute spectral signatures (Figure 8.6).
8.2.5 Vegetation Indices
A vegetation index (VI) represents a spectral transformation of two or more bands of spectral imagery into a single-band image. A VI is designed to enhance the vegetation signal with regard to different vegetation properties, while minimizing confounding factors such as soil background reflectance, directional effects, or atmospheric effects. There are many different VIs, including multispectral broadband indices as well as hyperspectral narrowband indices.
Most multispectral broadband indices make use of the inverse relationship between the low reflectance in the red (caused by chlorophyll absorption) and the high reflectance in the near-infrared (caused by leaf structure) to provide a measure of greenness that can be indirectly related to biochemical or structural vegetation properties (e.g., chlorophyll content, LAI). The Normalized Difference Vegetation Index (NDVI) is one of the most commonly used broadband VIs:
\[NDVI = \frac{\rho_{nir} - \rho_{red} }{\rho_{nir} + \rho_{red}}\]
The interpretation of the absolute value of the NDVI is highly informative, as it allows immediate recognition of the areas of a farm or field that have problems. The NDVI is a simple index to interpret: its values vary between -1 and 1, and each range of values corresponds to a different agronomic situation, regardless of the crop (Figure 8.5).
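As a minimal sketch with made-up reflectance values (illustration only, not from any real sensor), the NDVI formula above translates directly into R:

```r
# Hypothetical reflectance values for a healthy, dense canopy:
rho_red <- 0.05  # strong chlorophyll absorption in the red
rho_nir <- 0.45  # strong reflection in the near-infrared

# NDVI as defined above
ndvi <- (rho_nir - rho_red) / (rho_nir + rho_red)
ndvi  # ~0.8, in the range typical of dense green vegetation
```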

8.3 Remote sensing of crop diseases
8.3.1 Detection of plant stress
One popular use of remote sensing is in diagnosis and monitoring of plant responses to biotic (i.e., disease and insect damage) and abiotic stress (e.g., water stress, heat, high light, pollutants), with hundreds of publications on the topic. It is worth noting that most available techniques monitor the plant response rather than the stress itself. For example, with some diseases it is common to estimate changes in canopy cover (using vegetation indices) as measures of “disease”, but this measure could also be associated with water deficit (Jones and Vaughan 2010). This highlights the importance of measuring crop conditions in the field and laboratory to collect reliable data and be able to disentangle complex plant responses. Nevertheless, remote sensing can be used as the first step in site-specific disease control and also to phenotype the reactions of plant genotypes to pathogen attack (Lowe et al. 2017).
8.3.2 Optical methods for measuring crop disease
There are a variety of optical sensors for the assessment of plant diseases. Sensors can be based only on the visible spectrum (400-700 nm) or also cover the infrared spectrum (700 nm - 1 mm). The latter may include near-infrared (NIR) (0.75-1.4 \(\mu m\)), short-wavelength infrared (SWIR) (1.4-3 \(\mu m\)), medium-wavelength infrared (MWIR) (3-8 \(\mu m\)), or thermal infrared (8-15 \(\mu m\)) bands (Figure 8.6). Sensors record either imaging or non-imaging (i.e., average) spectral radiance values, which need to be converted to reflectance before conducting any crop disease monitoring task.

In a recent chapter of Agrios’ Plant Pathology, Del Ponte et al. (2024) highlight the importance of understanding the basic principles of the interaction of light with plant tissue or the plant canopy as a crucial prerequisite for the analysis and interpretation of optical data for disease assessment. When a plant is infected, changes occur in the physiology and biochemistry of the host, with the eventual development of disease symptoms and/or signs of the pathogen, which may be accompanied by structural and biochemical changes that affect the absorbance, transmittance, and reflectance of light (Figure 8.8).

8.3.3 Scopes of disease sensing
The quantification of typical disease symptoms (disease severity) and assessment of leaves infected by several pathogens are relatively simple for imaging systems but may become a challenge for non-imaging sensors and sensors with inadequate spatial resolution (Oerke 2020). Systematic monitoring of a crop by remote sensors can allow farmers to take preventive actions if infections are detected early. Remote sensing sensors and processing techniques need to be carefully selected to be capable of (a) detecting a deviation in the crop’s health status brought about by pathogens, (b) identifying the disease, and (c) quantifying the severity of the disease. Remote sensing can also be effectively used in (d) food quality control (Figure 8.8).

8.3.4 Monitoring plant diseases
Sensing of plants for precision disease control is done in large fields or greenhouses where the aim is to detect the occurrence of diseases at the early stages of epidemics, i.e., at low symptom frequency. Lowe et al. (2017) reviewed hyperspectral imaging of plant diseases, focusing on early detection of diseases for crop monitoring. They report several analysis techniques successfully used for the detection of biotic and abiotic stresses with reported levels of accuracy higher than 80%.
| Technique | Plant (stress) |
|---|---|
| Quadratic discriminant analysis (QDA) | Wheat (yellow rust) |
| | Avocado (laurel wilt) |
| Decision tree (DT) | Avocado (laurel wilt) |
| | Sugar beet (cercospora leaf spot) |
| | Sugar beet (powdery mildew) |
| | Sugar beet (leaf rust) |
| Multilayer perceptron (MLP) | Wheat (yellow rust) |
| Partial least squares regression (PLSR), on raw spectra and Savitzky-Golay 1st and 2nd derivatives | Celery (sclerotinia rot) |
| Partial least squares regression (PLSR) | Wheat (yellow rust) |
| Fisher's linear discriminant analysis | Wheat (aphid) |
| | Wheat (powdery mildew) |
| | Wheat (powdery mildew) |
| Erosion and dilation | Cucumber (downy mildew) |
| Spectral angle mapper (SAM) | Sugar beet (cercospora leaf spot) |
| | Sugar beet (powdery mildew) |
| | Sugar beet (leaf rust) |
| | Wheat (head blight) |
| Artificial neural network (ANN) | Sugar beet (cercospora leaf spot) |
| | Sugar beet (powdery mildew) |
| | Sugar beet (leaf rust) |
| Support vector machine (SVM) | Sugar beet (cercospora leaf spot) |
| | Sugar beet (powdery mildew) |
| | Sugar beet (leaf rust) |
| | Barley (drought) |
| Spectral information divergence (SID) | Grapefruit (canker, greasy spot, insect damage, scab, wind scar) |
Lowe et al. (2017) state that remote sensing of diseases under production conditions is challenging because of variable environmental factors and crop-intrinsic characteristics, e.g., 3D architecture, various growth stages, variety of diseases that may occur simultaneously, and the high sensitivity required to reliably perceive low disease levels suitable for decision-making in disease control. The use of less sensitive systems may be restricted to the assessment of crop damage and yield losses due to diseases.
8.3.5 UAV applications for plant disease detection and monitoring
Kouadio et al. (2023) undertook a systematic quantitative literature review to summarize existing literature on UAV-based applications for plant disease detection and monitoring. Results reveal a global disparity in research on the topic, with Asian countries being the top contributors, while world regions such as Oceania and Africa are comparatively underrepresented. To date, research has largely focused on diseases affecting wheat, sugar beet, potato, maize, and grapevine (Figure 8.9). Multispectral, red-green-blue, and hyperspectral sensors were most often used to detect and identify disease symptoms, with current trends pointing to approaches integrating multiple sensors and the use of machine learning and deep learning techniques. The authors suggest that future research should prioritize (i) development of cost-effective and user-friendly UAVs, (ii) integration with emerging agricultural technologies, (iii) improved data acquisition and processing efficiency, (iv) diverse testing scenarios, and (v) ethical considerations through proper regulations.

8.4 Disease detection
This section illustrates the use of unmanned aerial vehicle (UAV) remote sensing imagery for identifying banana wilt disease. Fusarium wilt of banana, also known as “banana cancer”, threatens banana production areas worldwide. Timely and accurate identification of Fusarium wilt disease is crucial for effective disease control and optimizing agricultural planting structure (Pegg et al. 2019).
A common initial symptom of this disease is the appearance of a faint pale yellow streak at the base of the petiole of the oldest leaf. This is followed by leaf chlorosis, which progresses from lower to upper leaves, wilting of leaves, and longitudinal splitting of their bases. Splitting of the pseudostem and leaf bases is more common in young, rapidly growing plants (Pegg et al. 2019; Figure 8.10).

Ye et al. (2020) made publicly available experimental data (Huichun YE et al. 2022) on wilted banana plants collected in a banana plantation located in Long’an County, Guangxi (China). The data set includes UAV multispectral reflectance data and ground survey data on the incidence of banana wilt disease. The paper by Ye et al. (2020) reports that the banana Fusarium wilt disease can be easily identified using several vegetation indices (VIs) obtained from this data set. Tested VIs include green chlorophyll index (CIgreen), red-edge chlorophyll index (CIRE), normalized difference vegetation index (NDVI), and normalized difference red-edge index (NDRE). The dataset can be downloaded from here.
8.4.1 Software setup
Let’s start by cleaning up R memory:
rm(list = ls())
Then, we need to install several packages (if they are not installed yet):
list.of.packages <- c("terra",
                      "tidyterra",
                      "stars",
                      "sf",
                      "leaflet",
                      "leafem",
                      "dplyr",
                      "ggplot2",
                      "tidymodels")
new.packages <- list.of.packages[!(list.of.packages %in% installed.packages()[, "Package"])]
if (length(new.packages)) install.packages(new.packages)
Now, let’s load all the required packages:
library(terra)
library(tidyterra)
library(stars)
library(sf)
library(leaflet)
library(leafem)
library(dplyr)
library(ggplot2)
library(tidymodels)
8.4.2 Reading the dataset
The next code chunks assume you have already downloaded the Huichun YE et al. (2022) dataset and unzipped its content into the data/banana_data directory.
8.4.3 File formats
Let’s list the files under each subfolder:
list.files("data/banana_data/1_UAV multispectral reflectance")
[1] "UAV multispectral reflectance.tfw"
[2] "UAV multispectral reflectance.tif"
[3] "UAV multispectral reflectance.tif.aux.xml"
[4] "UAV multispectral reflectance.tif.ovr"
Note that the .tif file contains an orthophotomosaic of surface reflectance. It was created from UAV images taken with a Micasense Red Edge M camera which has five narrow spectral bands: Blue (465–485 nm), green (550–570 nm), red (653–673 nm), red edge (712–722 nm), and near-infrared (800–880 nm). We assume here that those images have been radiometrically and geometrically corrected.
list.files("data/banana_data/2_Ground survey data of banana Fusarium wilt")
[1] "Ground_survey_data_of_banana_Fusarium_wilt.dbf"
[2] "Ground_survey_data_of_banana_Fusarium_wilt.prj"
[3] "Ground_survey_data_of_banana_Fusarium_wilt.sbn"
[4] "Ground_survey_data_of_banana_Fusarium_wilt.sbx"
[5] "Ground_survey_data_of_banana_Fusarium_wilt.shp"
[6] "Ground_survey_data_of_banana_Fusarium_wilt.shp.xml"
[7] "Ground_survey_data_of_banana_Fusarium_wilt.shx"
This is a shapefile with 80 points where the plant health status was recorded on the same date as the image acquisition.
list.files("data/banana_data/3_Boundary of banana planting region")
[1] "Boundary_of_banana_planting_region.dbf"
[2] "Boundary_of_banana_planting_region.prj"
[3] "Boundary_of_banana_planting_region.sbn"
[4] "Boundary_of_banana_planting_region.sbx"
[5] "Boundary_of_banana_planting_region.shp"
[6] "Boundary_of_banana_planting_region.shp.xml"
[7] "Boundary_of_banana_planting_region.shx"
This is a shapefile with one polygon representing the boundary of the study area.
8.4.4 Read the orthomosaic and the ground data
Now, let’s read the orthomosaic using the terra package:
# Open the tif
tif <- "data/banana_data/1_UAV multispectral reflectance/UAV multispectral reflectance.tif"
rrr <- terra::rast(tif)
Let’s check what we get:
rrr
class : SpatRaster
dimensions : 7885, 14420, 5 (nrow, ncol, nlyr)
resolution : 0.08, 0.08 (x, y)
extent : 779257.9, 780411.5, 2560496, 2561127 (xmin, xmax, ymin, ymax)
coord. ref. : WGS 84 / UTM zone 48N (EPSG:32648)
source : UAV multispectral reflectance.tif
names : UAV mu~ance_1, UAV mu~ance_2, UAV mu~ance_3, UAV mu~ance_4, UAV mu~ance_5
min values : 0.000000, 0.000000, 0.000000, 0.0000000, 0.000000
max values : 1.272638, 1.119109, 1.075701, 0.9651694, 1.069767
Note that this is a 5-band multispectral image with 8 cm pixel size.
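The layer names stored in the file are not informative. Assuming the layers follow the Micasense Red Edge M band order given above (blue, green, red, red edge, near-infrared; this ordering should be verified against the dataset documentation), we can assign friendlier names:

```r
# Rename layers assuming the Micasense Red Edge M band order:
# blue, green, red, red edge, NIR (an assumption -- verify it!)
names(rrr) <- c("blue", "green", "red", "rededge", "nir")
```

Named layers make later band arithmetic (e.g., vegetation indices) much easier to read.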
Now, let’s read the ground data:
shp <- "data/banana_data/2_Ground survey data of banana Fusarium wilt/Ground_survey_data_of_banana_Fusarium_wilt.shp"
ggg <- sf::st_read(shp)
Reading layer `Ground_survey_data_of_banana_Fusarium_wilt' from data source
`/Users/emersondelponte/Documents/GitHub/epidemiology-R/data/banana_data/2_Ground survey data of banana Fusarium wilt/Ground_survey_data_of_banana_Fusarium_wilt.shp'
using driver `ESRI Shapefile'
Simple feature collection with 80 features and 4 fields
Geometry type: POINT
Dimension: XY
Bounding box: xmin: 779548.9 ymin: 2560702 xmax: 780097 ymax: 2561020
Projected CRS: WGS 84 / UTM zone 48N
What did we get?
ggg
Simple feature collection with 80 features and 4 fields
Geometry type: POINT
Dimension: XY
Bounding box: xmin: 779548.9 ymin: 2560702 xmax: 780097 ymax: 2561020
Projected CRS: WGS 84 / UTM zone 48N
First 10 features:
OBJECTID 样点类型 x_经度 y_纬度 geometry
1 1 健康植株 107.7326 23.13240 POINT (779838.5 2560800)
2 2 健康植株 107.7332 23.13316 POINT (779901.2 2560885)
3 3 健康植株 107.7334 23.13394 POINT (779920.1 2560971)
4 4 健康植株 107.7326 23.13430 POINT (779837.5 2561010)
5 5 健康植株 107.7302 23.13225 POINT (779595.2 2560779)
6 6 健康植株 107.7301 23.13190 POINT (779584.6 2560739)
7 7 健康植株 107.7300 23.13297 POINT (779569.6 2560857)
8 8 健康植株 107.7315 23.13301 POINT (779729.4 2560865)
9 9 健康植株 107.7313 23.13245 POINT (779710.5 2560803)
10 10 健康植株 107.7349 23.13307 POINT (780078.9 2560879)
Note that the attribute names are in Chinese: 样点类型 means “sample point type”, 健康植株 means “healthy plant”, and x_经度 and y_纬度 are the longitude and latitude coordinates. It seems that we will need to make several changes.
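As a sketch of that cleanup (assuming 样点类型 codes the plant status and 健康植株 labels the healthy plants; the exact label used for diseased plants should be checked in the dataset documentation), the columns could be renamed and recoded with dplyr:

```r
# Translate the Chinese attribute names and values to English.
# 样点类型 = "sample point type"; 健康植株 = "healthy plant".
ggg_en <- ggg |>
  dplyr::rename(type = `样点类型`,
                lon  = `x_经度`,
                lat  = `y_纬度`) |>
  dplyr::mutate(type = dplyr::if_else(type == "健康植株",
                                      "healthy", "diseased"))
```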
8.4.5 Visualizing the data
As the orthomosaic is too large to visualize interactively, we will need a coarser version of it. Let’s use the terra package to create one.
rrr8 <- terra::aggregate(rrr, 8)
Let’s check the output:
rrr8
class : SpatRaster
dimensions : 986, 1803, 5 (nrow, ncol, nlyr)
resolution : 0.64, 0.64 (x, y)
extent : 779257.9, 780411.8, 2560496, 2561127 (xmin, xmax, ymin, ymax)
coord. ref. : WGS 84 / UTM zone 48N (EPSG:32648)
source(s) : memory
names : UAV mu~ance_1, UAV mu~ance_2, UAV mu~ance_3, UAV mu~ance_4, UAV mu~ance_5
min values : 0.000000, 0.000000, 0.000000, 0.000000, 0.000000
max values : 1.272638, 1.119109, 1.075701, 0.949925, 1.069767
Note that the pixel size of the aggregated raster is 64 cm. Now, in order to visualize the ground points, we will need a color palette:
pal <- colorFactor(
  palette = c("green", "red"),
  domain = ggg$样点类型
)
Then, we will use the leaflet package to plot the new image and the ground points:
leaflet(data = ggg) |>
  addProviderTiles("Esri.WorldImagery") |>
  addRasterImage(rrr8) |>
  addCircleMarkers(~x_经度, ~y_纬度,
                   radius = 5,
                   label = ~样点类型,
                   fillColor = ~pal(样点类型),
                   fillOpacity = 1,
                   stroke = FALSE)
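As a next step toward relating the imagery to the ground truth, we can extract the five band values at the 80 survey points and compute the NDVI for each point. This is a sketch; it assumes the layer order of the Micasense Red Edge M camera (band 3 = red, band 5 = NIR), which should be verified against the dataset documentation:

```r
# Extract the five band values at the ground survey points
# (first column of the result is the point ID)
vals <- terra::extract(rrr, terra::vect(ggg))

# NDVI per point, assuming band 3 = red and band 5 = NIR
ndvi_points <- (vals[[6]] - vals[[4]]) / (vals[[6]] + vals[[4]])

# Compare NDVI distributions of healthy vs. diseased points
tapply(ndvi_points, ggg$样点类型, summary)
```

If the NDVI separates the two classes well, as Ye et al. (2020) report, it can serve as a simple feature for classifying Fusarium-wilted plants.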