In 2020, Korolev Samara University researchers won a grant from the Russian Science Foundation (RSF) for the development of smart agricultural technology to improve the accuracy and profitability of crop farming. The grant will fund the development of a “vision” system for various kinds of agricultural machinery, employing AI to help the agricultural producer quickly resolve many day-to-day tasks: assessing land fertility, determining soil moisture and the required amount of fertilizers, locating foci of pest beetles in crops, and detecting plant diseases. In the interview below, Nikolay Kazansky, Samara University Professor at the Department of Engineering Cybernetics, explains how technology initially designed for space can be in demand on Earth.
The RSF grant won by the Department’s researchers was titled “System for monitoring agricultural indicators in visible, infrared and hyperspectral imaging modes”. What was the research about, essentially?
Let me give you some background first. Prior to the grant, the Department’s specialists had been engaged in a major space project developing hyperspectral hardware for the small spacecraft (SSC) platform. Among those was Aist-2, a universal SSC platform developed in partnership between Samara University and RSC Progress. Such satellites are up to 10 times lighter than traditional Earth remote sensing suites, so the requirements for their primary and secondary equipment are unsurprisingly high.
Our task was to provide the satellite with “vision”, and we solved it by designing a compact space hyperspectrometer and developing methods for processing hyperspectral information. In doing so, we reduced the weight (down to 5 kg) and energy consumption of the SSC equipment roughly tenfold compared with the equipment mounted on “big” satellites. Energy consumption is an important parameter for spacecraft “fed” by energy from solar panels. By comparison, functionally similar equipment on Resurs-P No. 2 weighs around 300 kg.
After the project was completed, we started looking into other applications for the equipment we had designed. For example, it could be installed on unmanned aerial drones or robots. A robot does not require a human-style, “chromatic” image. It only needs to “see” objects in the spectral range necessary for solving a dedicated task.
What are the possible applications of hyperspectral information?
A hyperspectral image helps visualize many things that are not discernible to the human eye in a black-and-white or color picture. Many representatives of the animal kingdom make use of various wavelengths. For instance, seagulls can assess the weather and develop a special film over their eyes that lets them see through the water and spot fish at a depth they can reach by diving. Some species of butterflies display their exquisite wing ornaments to a potential partner only at certain wavelengths, in a range invisible to the human eye. So, in a sense, we make technology “learn” from the opportunities available in nature. Our developments found their audience, and we teamed up with a new research partner, the Institute for Melioration Problems of the Academy of Agricultural Sciences in Novocherkassk, Rostov Region.
What sparked the Institute’s interest?
The capabilities of our hyperspectral equipment in determining soil moisture. The point is that moisture is very clearly visible at certain wavelengths of the near-infrared range. A regular image of soil cannot tell you whether a given spot needs watering or already holds enough water. In the near-IR, however, this question is very easy to answer if the image is taken in the so-called SWIR range.
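To make the idea concrete, here is a minimal sketch (not the team’s actual pipeline) of how relative soil moisture can be estimated from a hyperspectral cube by comparing a near-IR band against a SWIR band, where water absorption darkens the signal. The band indices and the threshold are illustrative assumptions, not published parameters.

```python
import numpy as np

def moisture_index(cube: np.ndarray, nir_band: int, swir_band: int) -> np.ndarray:
    """Normalized-difference moisture index per pixel.

    cube: hyperspectral image of shape (height, width, bands),
          reflectance values in [0, 1].
    """
    nir = cube[:, :, nir_band].astype(float)
    swir = cube[:, :, swir_band].astype(float)
    return (nir - swir) / (nir + swir + 1e-6)  # higher -> wetter soil

# Example with synthetic data: a 100x100-pixel scene with 200 spectral bands.
cube = np.random.rand(100, 100, 200)
idx = moisture_index(cube, nir_band=60, swir_band=150)  # band choices are assumptions
dry_mask = idx < 0.1  # pixels that would be flagged as needing water
print(f"{dry_mask.mean():.0%} of pixels flagged as dry")
```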
What are the practical benefits of the technology?
The opportunity to equip, for example, an irrigation system with hyperspectral equipment. Using hyperspectral data, AI can instantly determine whether a certain sector of the field needs watering and activate or deactivate the dedicated nozzles of the irrigation system. Besides, we will be able to “see” whether anything is growing on a given patch of land, so as not to waste water where, for example, planted seeds have not sprouted. This technology saves costs; we are talking about smart agriculture, and that would be a huge step forward for our nation. Our technology can go even further: it can “visualize” kinds of plant diseases in their characteristic wavelength ranges and identify missing nutrients in soils and plants.
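A hedged sketch of that decision step: average a moisture index and a vegetation index over each field sector, then water only the sectors that are dry and actually growing something. The thresholds and the sector layout here are illustrative assumptions; real irrigation hardware would expose its own control interface.

```python
import numpy as np

DRY_THRESHOLD = 0.1        # illustrative value, would be calibrated in the field
BARE_SOIL_THRESHOLD = 0.2  # vegetation index below this -> nothing is growing

def decide_irrigation(moisture: np.ndarray, vegetation: np.ndarray,
                      sector_masks: list[np.ndarray]) -> list[bool]:
    """Return True (water) or False (skip) for each sector of the field."""
    decisions = []
    for mask in sector_masks:
        mean_moisture = moisture[mask].mean()
        mean_vegetation = vegetation[mask].mean()
        # Water only if the sector is dry AND something is actually growing there.
        decisions.append(mean_moisture < DRY_THRESHOLD
                         and mean_vegetation > BARE_SOIL_THRESHOLD)
    return decisions

# Example: two sectors covering the left and right halves of a 100x100 field.
field_shape = (100, 100)
left = np.zeros(field_shape, dtype=bool)
left[:, :50] = True
sectors = [left, ~left]
print(decide_irrigation(np.random.rand(*field_shape),
                        np.random.rand(*field_shape), sectors))
```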
Compared to space imaging, which enables shooting at various altitudes and resolutions, your technology is completely “down to Earth”.
Not everything can be seen from space. Some of the wavelengths essential for recognition are absorbed by the Earth’s atmosphere; some objects are too small. Take a cucumber: it is indiscernible among the surrounding greenery, but it can be clearly visualized in the near-IR range as an object consisting of 90% water. Automatic harvesting can be performed by a neural network trained to search for cucumbers under the leaves and against their background, and to assess whether the vegetable is ripe, picking it or leaving it accordingly. There are many agriculture-related tasks, and they are often quite creative.
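As a minimal sketch of the kind of classifier described here, assuming PyTorch: a small CNN that takes a near-IR image patch and predicts one of three classes (background, unripe cucumber, ripe cucumber). The architecture and class set are illustrative assumptions, not the project’s actual model.

```python
import torch
import torch.nn as nn

class PatchClassifier(nn.Module):
    """Toy classifier for single-channel near-IR patches."""
    def __init__(self, in_channels: int = 1, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# One 64x64 near-IR patch (batch of 1, single channel).
model = PatchClassifier()
logits = model(torch.rand(1, 1, 64, 64))
print(logits.argmax(dim=1))  # predicted class index
```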
Besides, for agricultural applications it is important that the technology is cost-effective and reliable. This means that the price of our equipment should make up only a small percentage of the cost of the entire irrigation system. That is not a scientific task, because from the point of view of physics it doesn’t matter how you discover the ninth planet, through calculation or with a telescope. The point is that it is discovered; it doesn’t matter how much money was spent in the process or how many proton synchrotrons were built. Cost saving is not a fundamental task in science, but it is vital in agriculture.
Will the University’s scientists take part in software development for “training” the future hardware?
They will. Without the software, the hardware will never be able to discern what exactly it has filmed. The neural network needs to be trained to analyze how much phosphorus the soil is lacking, or to determine which species of microbe or virus has attacked a plant, and to propose the relevant treatment. To the neural network, such a situation may appear as a dip at the wavelengths characteristic of, for example, phosphorus.
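A hedged illustration of that “dip” idea: compare a measured reflectance spectrum against a healthy reference and flag wavelength windows where the signal drops noticeably. The window positions and the 10% threshold are purely illustrative; real diagnostic bands would come from agronomic reference data.

```python
import numpy as np

def find_dips(measured: np.ndarray, reference: np.ndarray,
              windows: dict[str, slice], drop: float = 0.10) -> list[str]:
    """Return labels of spectral windows where the measured spectrum
    falls more than `drop` (relative) below the reference."""
    flagged = []
    for label, window in windows.items():
        ratio = measured[window].mean() / reference[window].mean()
        if ratio < 1.0 - drop:
            flagged.append(label)
    return flagged

# Synthetic example: 200-band spectra with a simulated dip in a
# hypothetical "phosphorus-related" window.
reference = np.full(200, 0.5)
measured = reference.copy()
measured[120:130] *= 0.8  # simulate a 20% dip
print(find_dips(measured, reference,
                {"phosphorus-related bands": slice(120, 130)}))
```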
Are there any candidate manufacturers of hyperspectral equipment for agriculture?
Samara University will be in charge of the core elements of the system, namely the optics and equipment assembly. The casing can be manufactured by other project participants. Our research is multidisciplinary and involves the Institute for Melioration Problems and the A. A. Kharkevich Institute for Information Transmission Problems of the Russian Academy of Sciences, whose scientists specialize in artificial intelligence and intelligent analysis of color and multispectral images. They will work on these project areas together with the teams of the Department of Engineering Cybernetics and the Department of Supercomputers and General Informatics of our University.