Crop-Planting Area Prediction from Multi-Source Gaofen Satellite Images Using a Novel Deep Learning Model: A Case Study of Yangling District
Xiaofei Kuang, Jiao Guo, Jingyuan Bai, Hongsuo Geng, Hui Wang
Neural network models play an important role in crop extraction from remote sensing data; however, their performance degrades when the input data are high-dimensional. To address the challenges posed by multi-source Gaofen satellite data, a novel method is proposed that combines dimensionality reduction and crop classification: a stacked autoencoder network reduces the data dimensionality, and a convolutional neural network performs the classification. By leveraging multi-dimensional remote sensing information while mitigating the impact of high dimensionality on classification accuracy, the method aims to improve the effectiveness of crop classification. The proposed method was applied to the extraction of crop-planting areas in the Yangling Agricultural Demonstration Zone, using multi-temporal spectral data collected from the Gaofen satellites. The results demonstrate that the fusion network, which extracts low-dimensional features, offers advantages in classification accuracy. The proposed model was also compared with the decision tree (DT), random forest (RF), support vector machine (SVM), hyperspectral image classification based on a convolutional neural network (HICCNN), and a characteristic selection classification method based on a convolutional neural network (CSCNN). The overall accuracy of the proposed method reaches 98.57%, which is 7.95%, 4.69%, 5.68%, 1.21%, and 1.10% higher than these methods, respectively. The effectiveness of the proposed model was verified through experiments, and it also shows strong robustness when classifying new data. When extracting the crop area of the entire Yangling District, the errors for wheat and corn are only 9.6% and 6.3%, respectively, and the extraction results accurately reflect the actual planting situation of the crops.
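To make the two-stage architecture described above concrete, the following is a minimal sketch, not the authors' implementation, of a stacked-autoencoder encoder for per-pixel dimensionality reduction followed by a patch-based CNN classifier, written in PyTorch. The layer widths, latent dimension, patch size, and number of crop classes are illustrative assumptions.

```python
# Minimal sketch (assumed architecture, not the paper's code): a stacked
# autoencoder compresses high-dimensional multi-temporal spectral vectors,
# and a small CNN classifies patches of the compressed feature cube.
import torch
import torch.nn as nn


class StackedAutoencoder(nn.Module):
    """Per-pixel dimensionality reduction: spectral_dim -> latent_dim."""

    def __init__(self, spectral_dim=40, latent_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(spectral_dim, 32), nn.ReLU(),
            nn.Linear(32, 16), nn.ReLU(),
            nn.Linear(16, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 16), nn.ReLU(),
            nn.Linear(16, 32), nn.ReLU(),
            nn.Linear(32, spectral_dim),
        )

    def forward(self, x):            # x: (N, spectral_dim)
        z = self.encoder(x)
        return self.decoder(z), z


class PatchCNNClassifier(nn.Module):
    """Classifies a latent_dim x H x W patch into one of n_classes crops."""

    def __init__(self, latent_dim=8, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(latent_dim, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):            # x: (N, latent_dim, H, W)
        return self.head(self.features(x).flatten(1))


# Usage sketch: pretrain the autoencoder with a reconstruction loss, then
# encode each pixel's spectral vector and train the CNN on encoded patches.
sae = StackedAutoencoder(spectral_dim=40, latent_dim=8)
cnn = PatchCNNClassifier(latent_dim=8, n_classes=5)

pixels = torch.randn(256, 40)        # placeholder multi-temporal spectra
recon, latent = sae(pixels)
recon_loss = nn.functional.mse_loss(recon, pixels)

patches = torch.randn(16, 8, 9, 9)   # placeholder encoded 9x9 patches
logits = cnn(patches)                # (16, n_classes)
```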