Area Estimation of Mango and Coconut Crops using Machine Learning in Hesaraghatta Hobli of Bengaluru Urban District, Karnataka
DOI:
https://doi.org/10.58825/jog.2023.17.1.75

Keywords:
Mango, Coconut, Area estimation, Machine learning, Convolutional Neural Network, Random Forest (RF)

Abstract
Timely and accurate estimation of the acreage and production of horticulture crops is necessary for deciding how much of these commodities to export to national and global markets, and where and when. Remote sensing has been adopted, in addition to conventional sampling methods, to improve these estimates. Parametric image classification algorithms have been used by many researchers for the identification and area estimation of horticulture crops, but these algorithms leave several pixels unclassified, leading to over- or underestimates. This study was undertaken to estimate the area of two horticulture crops (mango and coconut) in Hesaraghatta hobli of Bengaluru Urban district using a Convolutional Neural Network (CNN) on Google Colab and the Random Forest (RF) algorithm on Google Earth Engine (GEE). Remotely sensed data acquired by the Multi-Spectral Instrument (MSI) onboard the Sentinel-2A satellite were used. Spectral signatures of the horticulture crops and other associated cover types were generated to identify the cover types and to select appropriate band combinations. Two band combinations were used for area estimation of the selected horticulture crops: i) Near-InfraRed (NIR), Red and Green, all three at 10 m spatial resolution, and ii) Red Edge-3, Short-Wave InfraRed 1 (SWIR1) and Short-Wave InfraRed 2 (SWIR2), at 20 m spatial resolution. Area estimates of the horticulture crops and associated cover types were validated against ground truth and statistical reports from the Karnataka State Directorate of Horticulture (KSDH). The CNN model performed better than RF with the NIR, Red and Green band combination, achieving an overall accuracy of 84%, but it failed to give similar accuracies with the Red Edge-3, SWIR1 and SWIR2 band combination. We attempted transfer learning with the trained CNN model at two sites far from the original study area and found encouraging results.
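The abstract does not specify the CNN architecture, patch size, or training setup used on Google Colab. The sketch below is only a minimal, hypothetical Keras illustration of the general approach: a small patch-classification network over the three 10 m bands (NIR, Red, Green), with layer sizes, patch size, class list and dummy data chosen purely for illustration. The final lines show one common way to reuse such a trained model for transfer learning at a new site, as the abstract reports attempting.

```python
# Hypothetical sketch: a small patch-based CNN in Keras (runnable in Google Colab)
# that classifies 3-band patches (e.g., NIR/Red/Green) into cover types.
# Architecture, patch size, class list and training data are illustrative assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

PATCH_SIZE = 32          # assumed patch size in pixels
NUM_BANDS = 3            # NIR, Red, Green
NUM_CLASSES = 4          # e.g., mango, coconut, other vegetation, non-vegetation

model = models.Sequential([
    layers.Input(shape=(PATCH_SIZE, PATCH_SIZE, NUM_BANDS)),
    layers.Conv2D(32, 3, activation='relu', padding='same'),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation='relu', padding='same'),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation='relu', padding='same'),
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation='softmax'),
])

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Dummy arrays stand in for patches extracted from the Sentinel-2 scene and
# their ground-truth labels; replace with real training samples.
x_train = np.random.rand(256, PATCH_SIZE, PATCH_SIZE, NUM_BANDS).astype('float32')
y_train = np.random.randint(0, NUM_CLASSES, size=256)

model.fit(x_train, y_train, epochs=5, batch_size=32, validation_split=0.2)

# Transfer learning at a new site: freeze the convolutional base and re-train
# only the classification head on samples from that site, then recompile.
for layer in model.layers[:-1]:
    layer.trainable = False
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```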
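Likewise, the Random Forest workflow on Google Earth Engine is not detailed in the abstract. A minimal sketch using the GEE Python API is given below, assuming a cloud-masked Sentinel-2 median composite, band combination i) (B8/B4/B3 at 10 m), an illustrative area of interest, a hypothetical ground-truth asset (`users/<your_account>/hesaraghatta_gt`) and 100 trees; none of these values come from the paper. The last step aggregates pixel area per class to obtain acreage.

```python
# Hypothetical sketch: Random Forest classification of a Sentinel-2 composite
# on Google Earth Engine (Python API), followed by per-class area estimation.
import ee

ee.Initialize()

# Placeholder area of interest (approximate rectangle around Hesaraghatta).
aoi = ee.Geometry.Rectangle([77.45, 13.05, 77.60, 13.20])

def mask_clouds(img):
    # Mask opaque clouds and cirrus using the QA60 bitmask.
    qa = img.select('QA60')
    clear = qa.bitwiseAnd(1 << 10).eq(0).And(qa.bitwiseAnd(1 << 11).eq(0))
    return img.updateMask(clear)

composite = (ee.ImageCollection('COPERNICUS/S2_SR_HARMONIZED')
             .filterBounds(aoi)
             .filterDate('2022-01-01', '2022-03-31')   # assumed date range
             .map(mask_clouds)
             .median())

# Band combination i): NIR (B8), Red (B4), Green (B3) at 10 m.
bands = ['B8', 'B4', 'B3']

# Assumed FeatureCollection of ground-truth points with an integer 'class'
# property (e.g., 0 = mango, 1 = coconut, 2 = other cover types).
training_points = ee.FeatureCollection('users/<your_account>/hesaraghatta_gt')

samples = composite.select(bands).sampleRegions(
    collection=training_points, properties=['class'], scale=10)

classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=samples, classProperty='class', inputProperties=bands)

classified = composite.select(bands).classify(classifier)

# Per-class area in hectares: sum pixel areas grouped by class label.
area = ee.Image.pixelArea().divide(1e4).addBands(classified)
stats = area.reduceRegion(
    reducer=ee.Reducer.sum().group(groupField=1, groupName='class'),
    geometry=aoi, scale=10, maxPixels=1e10)
print(stats.getInfo())
```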
License
Copyright (c) 2023 Journal of Geomatics
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.