Advances in hyperspectral cameras, in which light captured by the camera is divided into narrow spectral bands to generate a spectral fingerprint, are making their way into precision-agriculture applications. That fingerprint can reveal specific information about a plant, including its water content or the presence of disease. Technology innovation company imec presented its latest snapshot image-capture technology at the EE Times Green Engineering Summit. With this technology, imec’s hyperspectral video cameras can deliver a broad range of spectral bands spanning the visible, near-infrared (NIR), and shortwave-IR ranges.
The keynote, Hyperspectral Cameras: ‘New Eyes’ for Precision Agriculture, presented by Wouter Charle, manager for imec’s hyperspectral imaging technology, highlighted how the company’s imaging technology and video-rate snapshot cameras are bringing spectral imaging innovation from the lab into the field by looking beyond plant phenotyping.
“With plant phenotyping the aim is to predict or look at how a certain plant with a certain genome interacts with its environment,” said Charle.
Spectral imaging provides a lot of information about the plant by investigating how light interacts with it, he said. “For instance, a plant will get its energy from absorbing red light [around 600 nm] for photosynthesis, and if we can see how efficiently red light is being absorbed by the plant, we can also tell something about how healthy this plant is functioning.”
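One widely used way to quantify that relationship (an illustrative calculation, not a description of imec’s processing) is the normalized difference vegetation index (NDVI), which contrasts reflectance in a red band against a near-infrared band; healthy, actively photosynthesizing vegetation absorbs red light strongly and reflects NIR strongly. A minimal Python sketch with hypothetical reflectance values:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from per-pixel reflectances.
    Healthy vegetation absorbs red light and reflects NIR, so NDVI is high."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon guards against divide-by-zero

# Hypothetical reflectance values, for illustration only
print(ndvi(red=0.05, nir=0.50))  # ~0.82 -> vigorous canopy
print(ndvi(red=0.20, nir=0.40))  # ~0.33 -> stressed canopy
```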
Charle said hyperspectral imaging provides a window into a plant’s metabolism. One example is how stressed the plant is under drought conditions.
But there are many other parameters that hyperspectral imaging can provide about the plant’s cellular structure or other constituents of the plant, he said.
With this information, a machine-learning model can be created to predict the health of the plant. “If a plant has a certain deviation from how it should look, we can tell if a plant is sick and we can also determine which disease the plant is suffering from and how it will react and that enables us to anticipate these events to make better decisions in the field.”
The goal of this research is to provide predictive value to achieve better yield and healthier plants, which translates into economic benefits, Charle said. This means optimizing environmental conditions for plant growth, predicting the evolution of plant growth and diseases, and making better decisions ahead of time, in addition to increasing the yield and profitability of crops.
Imec is contributing to this research with its SNAPSCAN range of hyperspectral imaging cameras, which cover the visible, NIR, and shortwave-IR ranges. The technology innovator has developed a chip with integrated hyperspectral functionality thanks to interference-based optical filters deposited and patterned at the wafer level on top of the image sensor pixels. This enables the imagers to be cost-effectively mass-produced in a small form factor and integrated into a “regular” camera with standard lenses.
However, there are a lot of challenges in bringing the research into the field, according to Charle. The three key challenges are implementation, measurement technology, and the environment and the plant itself.
For implementation, it is difficult to scale up from the lab, Charle explained: lab space is confined, the infrastructure can be more expensive than in the field, and the lab does not require high throughput. In the field, by comparison, there is a high density of plants and a large throughput requirement.
Measurement technology used in the lab generates a high volume of data and associated processing, and the equipment cost can be higher. “In the field you will need to process all of this data, or you will need to store it, which means you will need even more processing power or storage, or you will need to deal with it in another way,” he said.
The environment also differs between the lab and the field. This is related to the plant “being a natural being and so unpredictable in form and shape and how it will grow,” said Charle. “Moreover, the plant is rotating to enable a scan or to scan it from different sides and the plant will start moving, making it even more complex to scan.”
As a result, there is a lot of research and investigation into how to deal with these constraints, said Charle. One such study, conducted by Joseph Peller at Wageningen University, shows how imec’s SNAPSCAN VNIR hyperspectral imaging camera can detect plant diseases by visualizing the difference between the reflectance spectra of healthy and diseased crops.
But what the researchers found really interesting, said Charle, is that “by doing a linear discriminant analysis on these spectra it showed that with just a few wavelengths it could actually have sufficient predictive power to tell which plants were diseased and which were not.” He said the largest differences were explained by six to eight wavelengths, and that a non-scientific CMOS image sensor provides more than sufficient SNR. In addition, a deep-learning model trained on a limited dataset delivered >80% precision.
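As a rough sketch of that kind of analysis (with synthetic data standing in for the study’s measurements; the band positions and counts below are assumptions for illustration), linear discriminant analysis can both classify the spectra and indicate which wavelengths carry the separating information:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: one averaged reflectance spectrum per plant.
# X has shape (n_plants, n_bands); y is 0 = healthy, 1 = diseased.
rng = np.random.default_rng(0)
n_plants, n_bands = 200, 150
X = rng.normal(0.4, 0.05, size=(n_plants, n_bands))
y = rng.integers(0, 2, size=n_plants)
X[y == 1, 60:68] += 0.08  # inject a "disease signature" into a few bands

lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean())

# The LDA weights show which bands drive the separation; the handful of
# bands with the largest absolute weights could guide the choice of
# filters for a few-wavelength snapshot sensor.
lda.fit(X, y)
top_bands = np.argsort(np.abs(lda.coef_[0]))[-8:]
print("most discriminative bands:", sorted(top_bands.tolist()))
```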
Why is this important? Charle said it negates the need for the rich datasets from scientific instruments to achieve a certain predictive power, and it showed that at these wavelengths there is a sufficient increase in contrast to outperform traditional methods.
Spectral data combined with AI can significantly improve on the predictive power of RGB, getting useful data out of just a regular sensor, he said. “Capturing the right data at the right wavelength is sufficient to have the right predictive power and he [Peller] illustrated that with a machine-learning model.”
Imec’s spectral filters
Imec’s contribution to the research is a different way of implementing spectral imaging compared with traditional devices: spectral filters are implemented directly onto the image sensor pixels. The company uses mosaic patterning technology to achieve video-mode hyperspectral imaging, in which the pixels are grouped into 3 × 3, 4 × 4, or 5 × 5 arrays. For example, a 4 × 4 mosaic pattern contains 16 pixels carrying 16 different filters, so each filter is the size of one imager pixel.
“We are post-processing wafers with imaging devices by adding layers on these wafers and patterning them to have spectral filters directly implemented at the single-pixel level,” he said.
“This is done by depositing what we call a resonant filter on each individual pixel and by tuning that filter we can determine which light or which wavelengths will pass through which pixel,” said Charle.
“Every pixel in that pattern will filter a different wavelength and by repeating this pattern over the full image sensor we will be sampling each individual wavelength over the full sensor image. Now the impact of that is that we can massively simplify and miniaturize existing spectral imaging systems,” he added.
“If we compare it to traditional spectral systems that are line-scanning devices [scanning a scene line by line], we are not filtering the light by using complex optical stacking in front of a sensor; we are filtering the light directly on the sensor,” he explained. “That means that we don’t need all of these other optical components in our imaging system. Instead, we are implementing everything on-chip, making a simple sensor that can go into a regular camera with standard lenses, like an ordinary machine-vision camera.”
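To illustrate what the on-chip mosaic implies for processing (a simplified sketch, not imec’s actual pipeline; real systems add per-band calibration and spatial interpolation), a raw frame from a hypothetical 4 × 4 mosaic sensor can be rearranged into a 16-band spectral cube by sub-sampling the frame at the mosaic stride:

```python
import numpy as np

def demosaic_snapshot(frame, pattern=4):
    """Rearrange a raw snapshot-mosaic frame into a spectral cube.

    Each pattern x pattern tile carries pattern**2 different spectral
    filters, so sampling the frame with a stride equal to the pattern
    size yields one reduced-resolution image per band."""
    h, w = frame.shape
    h -= h % pattern  # crop to whole tiles
    w -= w % pattern
    frame = frame[:h, :w]
    bands = [frame[r::pattern, c::pattern]   # one band per filter position
             for r in range(pattern)
             for c in range(pattern)]
    return np.stack(bands, axis=-1)          # shape (h/pattern, w/pattern, pattern**2)

raw = np.random.rand(2048, 2048)             # stand-in for a raw sensor frame
cube = demosaic_snapshot(raw)
print(cube.shape)                            # (512, 512, 16)
```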
Charle said imec’s technology integrates very easily into existing applications, and because it uses a standard CMOS process, it can easily scale and provide repeatable production of the sensor at a cost-effective price for volume applications.
This enables video-rate imaging, which is a true innovation for spectral imaging today, said Charle. “With spectral video, you can cope much more with the variability and dynamics in the scene because one image is directly capturing the spatial spectral information at once. By integrating everything together, we make it very miniaturized and robust, which is also important for in-the-field applications where you cannot use sensitive lab equipment.”
Detecting fireblight in orchards
Imec collaborated with pcfruit and VITO NV in Belgium, along with other partners, in a smart farming 4.0 project with the goal of detecting fireblight in orchards using spectral imaging. Fireblight can very rapidly spread and infect not only the entire orchard but also surrounding orchards, said Charle.
“The only resolution to fireblight that has spread is to destroy the orchard where it occurs. Now, the good news is if you can detect it early on, then the farmer can just simply cut it away from the tree. If only he is able to spot it.”
“It has been shown by our partners at VITO that you can detect fireblight by spectral signature,” said Charle. “We had the snapshot imager available with just the right wavelengths to detect that signature.
“In this project, one of the partners implemented two of our snapshot RedNIR cameras on a tractor platform and then by driving around and capturing the data from the trees in the orchard, the machine-learning algorithms that were also developed in the context of this project were able to – with sufficiently high contrast – predict and tell where this fireblight was occurring on which trees in the orchard,” he said.
Charle said for smart farming applications “we need to think out of the box” on how to take the research done in lab conditions into the field to solve the application problem in a practical way.
Research-scale data volumes are not required for applications in the field; what is important is capturing the right spectral bands for that application, he added.
Imec offers multiple ways for companies to collaborate, from custom filter designs on a custom chip to off-the-shelf sensors and cameras. These services are available via imec and its partners.