Precision agriculture is expected to meet future human demands for food, feed, fiber, and fuel while remaining sustainable and environmentally friendly. It relies largely on sensors and the corresponding data-processing technologies to provide decision support for planting, irrigation, fertilization, pesticide application, and harvesting. Compared with traditional point sensors, imaging sensors combined with artificial intelligence models are far more capable of measuring crop canopy-related parameters. However, the harsh field environment limits computational power, electricity, and network connectivity, hindering the full utilization of this technology.

This paper reports on AICropCAM, a system that integrates field image acquisition, deep image processing on the edge, the Internet of Things (IoT), and low-power long-range (LoRa) communication. At the core of AICropCAM is a stack of four deep convolutional neural network (DCNN) models that run sequentially: CropClassiNet for crop type classification, CanopySegNet for canopy coverage quantification, PlantCountNet for counting plants and weeds, and InsectNet for insect recognition. These DCNN models were trained in a supervised manner and tested on a large number of field crop images. The system is built on a distributed wireless sensor network, with sensor nodes comprising RGB cameras for image acquisition, Raspberry Pi 4B single-board computers for edge image processing, and Arduino MKR1310 boards for LoRa communication and power management.

Test results show that the per-image inference time and power consumption of the DCNN models range from 0.20 seconds (3.68 watts) for InsectNet to 20.20 seconds (5.83 watts) for CanopySegNet. The classification model CropClassiNet and the segmentation model CanopySegNet achieved accuracies of 94.50% and 92.83%, respectively. The average precisions of the two object detection models, PlantCountNet and InsectNet, were 0.69 and 0.02, respectively.
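The sequential four-model pipeline can be sketched as follows. This is a minimal illustration only: the function names mirror the models named in the paper, but the inference stubs and the result dictionary layout are placeholder assumptions, not the authors' implementation.

```python
# Hypothetical sketch of AICropCAM's sequential four-DCNN pipeline.
# Each *_net function stands in for a trained model's inference call.

def crop_classi_net(image):
    """Placeholder for CropClassiNet: returns a crop-type label."""
    return "maize"

def canopy_seg_net(image):
    """Placeholder for CanopySegNet: returns canopy coverage as a fraction."""
    return 0.42

def plant_count_net(image):
    """Placeholder for PlantCountNet: returns (plant_count, weed_count)."""
    return (24, 3)

def insect_net(image):
    """Placeholder for InsectNet: returns a list of detected insect labels."""
    return []

def process_image(image):
    """Run the four models one after another, as AICropCAM does on the edge."""
    result = {}
    result["crop_type"] = crop_classi_net(image)
    result["canopy_coverage"] = canopy_seg_net(image)
    result["plants"], result["weeds"] = plant_count_net(image)
    result["insects"] = insect_net(image)
    return result
```

Running the models sequentially on one device keeps peak memory low, which matters on a Raspberry Pi 4B with limited RAM.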
The model prediction results are then transmitted to the ThingSpeak IoT platform for data visualization and analysis. This study demonstrates that AICropCAM successfully implements deep image processing on the edge, dramatically reducing the amount of data transmitted, and is capable of meeting the real-time decision-making needs of precision agriculture. AICropCAM can also be deployed on mobile platforms, such as center-pivot irrigation systems or drones, to increase its spatial coverage and resolution, providing decision support for crop monitoring and field operations.
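For reference, ThingSpeak channel updates are made through its REST API, where each channel exposes up to eight numbered fields. The sketch below builds such an update request from model outputs; the mapping of results to field1–field4 is an assumed layout for illustration, not necessarily the authors' channel configuration, and the request is only constructed, not sent.

```python
from urllib.parse import urlencode

# ThingSpeak's documented channel-update endpoint.
THINGSPEAK_UPDATE_URL = "https://api.thingspeak.com/update"

def build_thingspeak_request(api_key, crop_type, coverage, plants, weeds):
    """Map edge inference results to ThingSpeak fields.

    The field1..field4 assignment here is a hypothetical layout chosen
    for this sketch; any channel would define its own field meanings.
    """
    params = {
        "api_key": api_key,
        "field1": crop_type,   # crop classification label
        "field2": coverage,    # canopy coverage fraction
        "field3": plants,      # plant count
        "field4": weeds,       # weed count
    }
    return f"{THINGSPEAK_UPDATE_URL}?{urlencode(params)}"
```

Sending only these few numeric fields, rather than the image itself, is what makes low-bandwidth LoRa uplinks and free-tier IoT dashboards practical.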
Fig. 1. Steps of edge image processing program deployment on the embedded system (edge devices).
Fig. 2. Left: An illustration of how AICropCAM was set up in the field for image collection. In addition to the camera, other components such as the solar panel and data logger are also shown. Right: A close-up view of AICropCAM and its hardware components.
Fig. 4. Hardware overview of AICropCAM and data flow.
Fig. 5. Overall sequential image processing and data generation flow chart.
Fig. 6. Examples of message generation and data size reduction for LoRa transmission.
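The data-size reduction behind Fig. 6 can be illustrated with a compact binary payload: instead of transmitting a multi-megabyte image, the node sends only the inference results, packed into a few bytes that fit comfortably within LoRa's small payload limits. The exact field layout below (crop-type code, coverage percentage, plant/weed/insect counts in 6 bytes) is an illustrative assumption, not the paper's actual message format.

```python
import struct

# Hypothetical 6-byte LoRa payload: >BBHBB = big-endian unsigned byte,
# unsigned byte, unsigned 16-bit int, unsigned byte, unsigned byte.
PAYLOAD_FORMAT = ">BBHBB"

def pack_results(crop_code, coverage_pct, plants, weeds, insects):
    """Pack inference results into a fixed-size binary LoRa message."""
    return struct.pack(PAYLOAD_FORMAT, crop_code, coverage_pct,
                       plants, weeds, insects)

def unpack_results(payload):
    """Recover the result tuple on the receiving end (e.g. a LoRa gateway)."""
    return struct.unpack(PAYLOAD_FORMAT, payload)
```

A 6-byte message in place of, say, a 3 MB image is a reduction of roughly six orders of magnitude, which is the essence of processing on the edge and transmitting only results.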
Chamara, N., Bai, G., Ge, Y., 2023. AICropCAM: Deploying classification, segmentation, detection, and counting deep-learning models for crop monitoring on the edge. Computers and Electronics in Agriculture 215, 108420.
Author Introduction
Nipuna Chamara, University of Nebraska-Lincoln, Department of Biological Systems Engineering
Geng Bai, University of Nebraska-Lincoln, Department of Biological Systems Engineering
Yufeng Ge, University of Nebraska-Lincoln, Department of Biological Systems Engineering
Further Reading
Plant Phenotyping Information Directory Summary for 2018
Plant Phenotyping Information Directory Summary for 2019
Plant Phenotyping Information Directory Summary for 2020
Plant Phenotyping Information Directory Summary for 2021
Plant Phenotyping Information Directory Summary for 2022
Plant Phenotyping Information Directory Summary for 2023
Plant Phenotyping Information Directory Special Collection Access
What Does the Best-Selling Field Phenotyping Platform Look Like?
