
New Software Boosts Accuracy of Tech to Measure Crop Health

An interdisciplinary team of researchers has developed a new tool that improves the accuracy of electronic devices that measure the color of a plant’s leaves to assess health. The new technology works by improving a sensor’s ability to account for variations in light that can influence how the sensor perceives color.

“There is a tremendous amount of research being done that focuses on developing new plant varieties that are better able to withstand challenges such as drought, high temperatures and so on,” says Michael Kudenov, co-author of a paper on the new software and a professor of electrical and computer engineering at North Carolina State University. “Many of these researchers use sensors that capture the color of a plant’s leaves to assess plant health, which is critical for their work. These sensors are also used by some growers and crop consultants to assess crop health. However, when researchers, growers or crop consultants are working with crops in fields, the sunlight can affect the ability of these sensors to capture leaf color accurately. Specifically, glare can throw the sensors off.

“Our goal was to develop software that would allow users to more easily account for the ways in which glare from sunlight can change the ways that sensors capture the color of a plant’s leaves,” Kudenov says. “Previous tools that account for glare have been extremely complex and required a lot of computational power. Our approach is substantially less complicated.”

A key idea to understand here is polarization. If we think of light as a wave, that wave can oscillate along many different planes. When light is polarized, all of it is oscillating in the same plane. If you've ever tried to look into a body of water on a bright day, you've probably noticed that glare from the sun can make it difficult to see below the surface. Put on a pair of polarized sunglasses, though, and the glare effectively disappears, allowing you to see below the surface of the water.
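The sunglasses analogy can be made concrete with Malus's law, the textbook formula for how much polarized light passes through a polarizing filter. This is a generic physics illustration, not code from the study; the function name is ours.

```python
import math

def transmitted_intensity(i0, theta_deg):
    """Malus's law: polarized light of intensity i0 passing through a
    polarizer whose axis is rotated theta degrees from the light's
    plane of polarization is attenuated by cos^2(theta)."""
    theta = math.radians(theta_deg)
    return i0 * math.cos(theta) ** 2

# Aligned filter passes everything; a crossed filter blocks the glare.
aligned = transmitted_intensity(1.0, 0)    # full intensity
crossed = transmitted_intensity(1.0, 90)   # essentially zero
```

Polarized sunglasses exploit the fact that glare reflected off water is largely polarized in one plane, so a filter oriented perpendicular to that plane blocks it while passing most of the unpolarized light from below the surface.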

“The software we’ve developed essentially acts like an incredibly dynamic pair of polarized sunglasses, able to account for whatever polarization challenges are present in order to accurately capture the color of a leaf, regardless of the glare,” says Daniel Krafft, first author of the paper and a Ph.D. student at NC State.

Here’s how the new tool works. When a sensor takes a picture of a leaf, it captures not only color but also a measure of how polarized the light is. The new software then estimates the true color of the leaf from two variables: the color the sensor perceived and the degree of polarization at the darkest wavelength in the image.
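The intuition behind those two inputs can be sketched in a few lines. Glare is sunlight reflected off the waxy leaf surface, and that reflected light is strongly polarized, so the degree of polarization at a dark (low-reflectivity) wavelength serves as a proxy for how much of the measured signal is glare rather than leaf color. The toy function below is our own linear illustration of that idea; the actual tool, per the paper's abstract, learns this mapping with a shallow neural network trained on simulated data, and the calibration constant `k` here is hypothetical.

```python
def estimate_true_reflectance(measured, dolp_darkest, k=1.0):
    """Toy glare correction (not the paper's trained network).

    measured      -- reflectance the sensor recorded, in [0, 1]
    dolp_darkest  -- degree of linear polarization at the darkest
                     wavelength, in [0, 1]; a proxy for the glare fraction
    k             -- hypothetical calibration constant
    """
    glare = k * dolp_darkest * measured   # estimated specular component
    return max(measured - glare, 0.0)     # subtract glare, clamp at zero

# No polarization signal -> no correction needed.
unchanged = estimate_true_reflectance(0.5, 0.0)   # 0.5
# Strong polarization -> a large share of the signal is treated as glare.
corrected = estimate_true_reflectance(0.5, 0.4)   # 0.3
```

The design point is simply that polarization gives the sensor an independent handle on glare, so the correction can be applied per pixel without the heavy computation of earlier approaches.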

To assess the new tool, the researchers conducted proof-of-concept testing that compared the performance of sensors with and without the new software when measuring leaves whose true color was already known. They found that the new software performed exceptionally well.

“The new software reduced the magnitude of errors tenfold when there was a lot of glare,” Kudenov says. “For example, if the color recorded by a sensor with the new software was off by 3%, the color recorded by a sensor without the software was off by 30%. And when there’s not a lot of glare, then you don’t need the software to do as much, so the difference between the two sensors was less pronounced.”

The researchers tested the new software using a full-size hyperspectral polarization camera. Next steps include incorporating the new software into more compact visual sensors and testing it on platforms such as drones to see how it performs in real-world situations with a variety of crops.

“Ultimately, we’d like to provide researchers and growers with a tool that is small enough and inexpensive enough for practical use,” Kudenov says.

The paper, “Mitigating Illumination-, Leaf-, and View-Angle Dependencies in Hyperspectral Imaging Using Polarimetry,” is published in the open-access journal Plant Phenomics. The paper was co-authored by Clifton Scarboro, former Ph.D. student at NC State; William Hsieh, a former undergraduate at NC State; Colleen Doherty, an associate professor of molecular and structural biochemistry at NC State; and Peter Balint-Kurti, a USDA-ARS research geneticist and adjunct professor of plant pathology at NC State.

The research was done with support from the National Science Foundation, under grant number 1809753; and the National Institute of Food and Agriculture, under grant number 2020-67021-31961.

-shipman-

Note to Editors: The study abstract follows.

“Mitigating Illumination-, Leaf-, and View-Angle Dependencies in Hyperspectral Imaging Using Polarimetry”

Authors: Daniel Krafft, Clifton G. Scarboro, William Hsieh, Colleen Doherty, Peter Balint-Kurti and Michael Kudenov, North Carolina State University

Published: March 22, Plant Phenomics

DOI: 10.34133/plantphenomics.0157

Abstract: Automation of plant phenotyping using data from high-dimensional imaging sensors is on the forefront of agricultural research for its potential to improve seasonal yield by monitoring crop health and accelerating breeding programs. A common challenge when capturing images in the field relates to the spectral reflection of sunlight (glare) from crop leaves which, at certain solar incidences and sensor viewing angles, presents unwanted signals. The research presented here involves the convergence of two parallel projects to develop an algorithm which can use polarization data to decouple light reflected from the surface of the leaves and light scattered from the leaf’s tissue. The first project is a mast-mounted hyperspectral imaging polarimeter (HIP) that can image a maize field across multiple diurnal cycles throughout a growing season. The second project is a multistatic fiber-based (MFB) Mueller matrix bidirectional reflectance distribution function (mmBRDF) instrument which measures the polarized light-scattering behavior of individual maize leaves. This data was fitted to an existing model using SCATMECH, which outputs parameters that were used to run Monte Carlo simulations. The simulated data were then used to train a shallow neural network which works by comparing unpolarized two-band vegetation index (VI) with linearly polarized data from the low-reflectivity bands of the VI. Using GNDVI and Red-edge Reflection Ratio (RERR) we saw an improvement of an order of magnitude or more in the mean error (ϵ) and a reduction spanning 1.5 to 2.7 in their standard deviation (ϵσ) after applying the correction network on the HIP sensor data.

This post was originally published in NC State News.