"Hybrid Ensemble Learning with CNN and RNN for Multimodal Cotton Plant Disease Detection" by Anita Shrotriya, Akhilesh Kumar Sharma et al.
 

Hybrid Ensemble Learning with CNN and RNN for Multimodal Cotton Plant Disease Detection

Document Type

Article

Publication Title

IEEE Access

Abstract

In agriculture, accurate and timely detection of plant diseases is crucial for minimizing crop losses and ensuring food security. Traditional methods of plant disease detection often rely on visual inspection and single-modal data analysis, which can be limited in their diagnostic accuracy. This study introduces an innovative ensemble learning framework that integrates Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) for multimodal plant disease detection to address these limitations. The proposed framework capitalizes on the strengths of both CNNs and RNNs by processing visual data from leaf images and sequential data such as time-series measurements of environmental conditions. CNNs are adept at extracting intricate spatial features from leaf images, identifying visual symptoms of diseases with high precision. Concurrently, RNNs are designed to capture temporal patterns in sequential data, providing insights into environmental factors that may influence disease development. The ensemble method employed in this study aggregates predictions from both CNN and RNN models using techniques such as majority voting and weighted averaging. Majority voting involves combining the outputs of multiple models to make a final prediction based on the most common outcome, while weighted averaging assigns different weights to the predictions of each model based on their performance, leading to a more balanced and accurate diagnostic result. Experimental evaluations were conducted on comprehensive multimodal datasets, including diverse plant species and varying environmental conditions, to assess the effectiveness of the proposed framework. The results demonstrate that the ensemble approach significantly outperforms individual CNN and RNN models, achieving higher diagnostic accuracy, precision, recall, and F1 scores. 
This superior performance underscores the potential of integrating diverse data streams to provide a holistic view of plant health, enabling more accurate and reliable disease diagnosis. The findings of this study highlight the importance of leveraging multimodal data and advanced machine-learning techniques in plant disease detection. By integrating spatial and temporal information, the proposed framework offers a comprehensive diagnostic tool that can be instrumental in improving agricultural practices, optimizing plant health management, and ultimately contributing to sustainable farming practices.
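The two aggregation schemes named in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: the model outputs are hypothetical placeholder probabilities, and the weights (0.6 for the CNN, 0.4 for the RNN) are assumed values standing in for the performance-based weights the abstract describes.

```python
from collections import Counter

def majority_vote(label_lists):
    """Pick the most common predicted label across models for each sample."""
    return [Counter(labels).most_common(1)[0][0] for labels in zip(*label_lists)]

def weighted_average(prob_lists, weights):
    """Fuse per-class probabilities, weighting each model by its weight."""
    total = sum(weights)
    fused = []
    for per_model in zip(*prob_lists):  # per_model: one prob vector per model
        n_classes = len(per_model[0])
        fused.append([sum(w * p[c] for w, p in zip(weights, per_model)) / total
                      for c in range(n_classes)])
    return fused

# Hypothetical outputs for 3 samples over 3 disease classes.
cnn_probs = [[0.7, 0.2, 0.1], [0.1, 0.6, 0.3], [0.3, 0.3, 0.4]]
rnn_probs = [[0.6, 0.3, 0.1], [0.4, 0.5, 0.1], [0.2, 0.5, 0.3]]

cnn_labels = [max(range(3), key=lambda c: p[c]) for p in cnn_probs]
rnn_labels = [max(range(3), key=lambda c: p[c]) for p in rnn_probs]

voted = majority_vote([cnn_labels, rnn_labels])
fused = weighted_average([cnn_probs, rnn_probs], weights=[0.6, 0.4])
fused_labels = [max(range(3), key=lambda c: p[c]) for p in fused]
```

Weighted averaging lets the ensemble lean toward whichever modality is more reliable for a given task, which is why the abstract reports it yielding a more balanced result than voting alone.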

DOI

10.1109/ACCESS.2024.3515843

Publication Date

1-1-2024

