Abstract: Cashew is one of the most widely consumed nuts in the world and an important cash crop. A tree can generate a substantial yield within a few months and has a lifespan of around 70 to 80 years. Yet, alongside these benefits, its cultivation faces certain constraints. Apart from parasites and algae, anthracnose is the most common disease affecting the trees. The dense canopy of the cashew tree makes the disease harder to diagnose than in shorter crops. Hence, we present a dataset consisting exclusively of healthy and diseased cashew leaves and fruits. The dataset is augmented with RGB color transformations that highlight diseased regions, photometric and geometric augmentations, and RaLSGAN-generated images to enlarge the initial collection and boost performance in real-time situations where only a constrained dataset is available. Further, transfer learning with algorithms such as MobileNet and Inception is used to test the classification efficiency of the dataset. TensorFlow Lite is used to deploy these models for real-time disease diagnosis from drones. Several post-training optimization strategies are applied, and the resulting model sizes are compared. The optimized models deliver high accuracy (up to 99%) with reduced memory footprint and latency, making them well suited to resource-constrained applications.
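As a rough illustration of the deployment pipeline described above, the sketch below builds a MobileNet-family classifier via transfer learning and applies TensorFlow Lite post-training dynamic-range quantization. The class count, input resolution, training call, and file names are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch: transfer learning with a MobileNet-family backbone followed by
# TensorFlow Lite post-training quantization for on-drone deployment.
import tensorflow as tf

NUM_CLASSES = 2          # assumed: healthy vs. diseased (anthracnose)
IMG_SIZE = (224, 224)    # assumed MobileNet input resolution

# Frozen ImageNet backbone plus a small classification head.
base = tf.keras.applications.MobileNetV2(input_shape=IMG_SIZE + (3,),
                                         include_top=False, weights="imagenet")
base.trainable = False
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train on the cashew dataset

# Post-training optimization: dynamic-range quantization shrinks the model
# for resource-constrained, real-time inference.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
with open("cashew_mobilenet_quant.tflite", "wb") as f:
    f.write(tflite_model)
```

Other post-training strategies, for example float16 quantization via `converter.target_spec.supported_types = [tf.float16]`, can be compared in the same way by measuring the size of the emitted .tflite files.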
Abstract: Human activity recognition is one of the most important tasks in computer vision and has proved useful in fields such as healthcare, sports training, and security. A number of approaches have been explored to solve this task, some involving sensor data and some involving video data. In this paper, we explore two deep learning-based approaches, namely single-frame Convolutional Neural Networks (CNNs) and Convolutional Long Short-Term Memory (ConvLSTM) networks, to recognise human actions from videos. CNN-based methods are advantageous because CNNs extract features automatically, while LSTM networks are well suited to sequence data such as video. The two models were trained and evaluated on a benchmark action recognition dataset, UCF50, and on another dataset created for this experimentation. Although both models achieve good accuracy, the single-frame CNN outperforms the ConvLSTM model, reaching 99.8% accuracy on the UCF50 dataset.
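For concreteness, the sketch below outlines the two kinds of architecture being compared, a single-frame CNN and a ConvLSTM, in Keras. Layer sizes, sequence length, and frame resolution are illustrative assumptions rather than the exact configurations used in the paper.

```python
# Minimal sketch of the two architectures compared in the abstract.
import tensorflow as tf

SEQ_LEN, H, W, C = 20, 64, 64, 3   # assumed frames per clip and frame size
NUM_CLASSES = 50                   # UCF50 has 50 action classes

# (a) Single-frame CNN: classifies each frame independently; a clip-level
# prediction can be obtained by averaging the per-frame scores.
single_frame_cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(H, W, C)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# (b) Convolutional LSTM: consumes the whole frame sequence and models
# temporal dependencies with ConvLSTM2D cells.
conv_lstm = tf.keras.Sequential([
    tf.keras.layers.ConvLSTM2D(16, 3, return_sequences=True,
                               input_shape=(SEQ_LEN, H, W, C)),
    tf.keras.layers.MaxPooling3D(pool_size=(1, 2, 2)),
    tf.keras.layers.ConvLSTM2D(32, 3, return_sequences=False),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

for m in (single_frame_cnn, conv_lstm):
    m.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```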
Abstract: The use of unmanned aerial vehicles (UAVs) is revolutionizing the agricultural industry. Cashews are grown by approximately 70% of small and marginal farmers, and the cashew industry plays a critical role in their economic development. To take timely counter-measures against plant diseases and infections, it is imperative to monitor crops and detect diseases as early as possible. UAVs equipped with artificial intelligence can assist farmers by providing early detection of crop diseases and precision pesticide application. An edge-computing paradigm of artificial intelligence is employed to process the captured images so that decisions can be made with minimal latency. These decisions determine the stage of infestation, the crops affected, how to prevent the disease from spreading, and the type and amount of pesticide to apply. UAVs equipped with sensors detect disease patterns quickly and accurately over large areas. Combined with AI algorithms, these machines can analyse data from a variety of sources such as temperature, humidity, CO2 levels, and soil composition, allowing them to recognize disease symptoms before they become visible. Early detection enables more effective control strategies that reduce the costs of lost production due to infestations or crop failure. Using an end-to-end training architecture, MobileNetV2 learns to classify anthracnose disease in cashew leaves. The standard PlantVillage dataset is used for performance evaluation and standardization. Additionally, samples captured with a drone span a wide variety of imaging conditions, which complicates the analysis. In our evaluation, anthracnose was identified with 95% accuracy and healthy leaves with 99% accuracy.
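A minimal sketch of the on-device (edge) inference step is given below, assuming the trained MobileNetV2 classifier has already been converted to a TensorFlow Lite file named cashew_mobilenetv2.tflite. The file name, label order, and preprocessing are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: low-latency edge inference with a converted MobileNetV2
# classifier (assumed two classes: anthracnose vs. healthy).
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="cashew_mobilenetv2.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def classify(frame: np.ndarray) -> str:
    """Classify one RGB frame captured by the drone camera (assumed float model)."""
    _, h, w, _ = input_details[0]["shape"]
    x = tf.image.resize(frame, (h, w)).numpy()[None].astype(np.float32) / 255.0
    interpreter.set_tensor(input_details[0]["index"], x)
    interpreter.invoke()
    probs = interpreter.get_tensor(output_details[0]["index"])[0]
    labels = ["anthracnose", "healthy"]   # assumed label order
    return labels[int(np.argmax(probs))]
```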