Scalambrin, Luca (2021) Aprendizaje profundo en clasificación de cultivos en imágenes satelitales / Deep learning in crop classification in satellite imagery. Proyecto Integrador Ingeniería en Telecomunicaciones, Universidad Nacional de Cuyo, Instituto Balseiro.
| PDF (Tesis) Español 17Mb |
Resumen en español
En países donde la agricultura posee un rol fundamental en su economía, resulta de vital importancia el planeamiento estratégico para la correcta utilización de los territorios y recursos disponibles. Para ello, el sensado remoto es una gran herramienta que, en combinación con otras tecnologías, ha permitido lograr grandes avances en la automatización de los procesos implicados en la gestión de recursos. En el presente trabajo se desarrolló un flujo de trabajo que permite llevar a cabo el proceso de clasificación de cultivos mediante el uso de técnicas de inteligencia artificial e imágenes satelitales de la misión Sentinel-2 [1], lo cual puede contribuir a la toma de decisiones y, a su vez, ayudar en la adopción de prácticas que garanticen la preservación del suelo y del medio ambiente. En este proceso se implementaron un segmentador y un clasificador basados en arquitecturas de redes neuronales profundas que forman parte del estado del arte en sus respectivas áreas de desarrollo. Para el segmentador se implementó una red conocida como ResUNet-a [2], la cual posee una estructura de encoding-decoding que deriva de la típica red U-Net, que ha probado ser muy versátil en lo que respecta a la segmentación [3-6]. Dicha red fue entrenada con la región correspondiente a la zona de Pergamino, en la provincia de Buenos Aires, con la cual se obtuvo un accuracy del 85.8% y un Coeficiente de Dice (ver sección 2.3.4) del 73%, logrando una contribución positiva a los resultados obtenidos previamente por INVAP-FRONTEC utilizando métodos clásicos (random forest, principalmente). Esto es de particular interés porque suplementa los métodos previos con un enfoque desde el campo del Aprendizaje Profundo y permitiría a INVAP-FRONTEC realizar inferencias de mayor robustez.
Luego, este modelo permitió segmentar la región de General López, en la provincia de Santa Fe, debido a que los cultivos del dataset a emplearse en la clasificación corresponden a esta última región. Para llevar a cabo la clasificación se implementó una arquitectura innovadora propuesta en [7] que, esencialmente, posee tres bloques fundamentales, los cuales permiten realizar una extracción de features a través de correlaciones temporales y espectrales, para luego realizar la clasificación. A partir de esta red y de la segmentación previa, fue posible la implementación de tres procesos diferentes para la aumentación de datos. En el primero se utilizaron únicamente los puntos a clasificar, mientras que en los otros dos se emplearon, además, los segmentos obtenidos previamente con el segmentador. Si bien los resultados obtenidos fueron similares a los de la competencia de la Fundación Sadosky de la cual se obtuvieron los datos [8], considerando la métrica allí empleada, el mejor desempeño se obtuvo a partir del proceso en el cual se realizó un muestreo aleatorio dentro de los segmentos correspondientes a los diferentes campos de cultivos (ver sección 3.4.2). Con este último proceso se logró un accuracy del 83.4% y un balanced accuracy del 57%, el cual es superior al obtenido por el resultado ganador de la competencia [8]. Finalmente, un último análisis permitió concluir que la banda NIR, junto con las tres bandas de la región visible, representa, al menos para el set de datos aquí empleado, la información más relevante para clasificar cultivos.
Resumen en inglés
In countries where agriculture plays a fundamental role in the economy, strategic planning is of vital importance for the correct use of available territories and resources. Remote sensing is a great tool for this purpose: in combination with other technologies, it has enabled great advances in automating the processes involved in resource management. In the present work, a workflow was developed that carries out crop classification using artificial intelligence techniques and satellite images from the Sentinel-2 mission [1], which can contribute to decision-making and, in turn, help in adopting practices that guarantee the preservation of the soil and the environment. In this process, a segmenter and a classifier based on deep neural network architectures were implemented, both part of the state of the art in their respective development areas. For the segmenter, a network known as ResUNet-a was implemented [2], which has an encoding-decoding structure derived from the typical U-Net network, proven to be very versatile for segmentation tasks [3-6]. This network was trained on the region corresponding to the Pergamino area in the province of Buenos Aires, obtaining an accuracy of 85.8% together with a Dice Coefficient (see section 2.3.4) of 73%, a positive contribution to the results previously obtained by INVAP-FRONTEC using classical methods (mainly random forest). This is of particular interest because it supplements the previous methods with an approach from the field of Deep Learning and would allow INVAP-FRONTEC to make more robust inferences. This model then made it possible to segment the General López region of the Santa Fe province, the territory containing the elements of the dataset used for the classification process.
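The Dice coefficient quoted above (73%) measures the overlap between a predicted segmentation mask and the reference mask. A minimal NumPy sketch of the metric (the function name, toy masks, and smoothing term are illustrative, not taken from the thesis):

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice = 2*|A ∩ B| / (|A| + |B|) for binary masks;
    eps avoids division by zero when both masks are empty."""
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy 4x4 masks: 3 of the 4 predicted pixels overlap the reference.
pred = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
ref  = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
print(dice_coefficient(pred, ref))  # 2*3 / (4+3) ≈ 0.857
```

Unlike plain pixel accuracy, Dice is not inflated by a large background class, which is why it is reported alongside accuracy for the segmentation results.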
To carry out the classification, an innovative architecture proposed in [7] was implemented. It essentially has three fundamental blocks, which extract features through temporal and spectral correlations and then perform the classification. From this network and the previous segmentation results, it was possible to implement three different processes for data augmentation. In the first, only the points to be classified were used, while in the other two the segments previously obtained with the segmenter were also used. Although the results were similar to those obtained in the Sadosky Foundation competition from which the data were obtained [8], considering the metric used there, the best performance came from the process in which random sampling was carried out within the segments corresponding to the different crop fields (see section 3.4.2). With this last process, an accuracy of 83.4% and a balanced accuracy of 57% were achieved, the latter higher than the winning result of the competition [8]. Finally, a last analysis concluded that the NIR band, together with the 3 bands of the visible region, carries, at least for the dataset used here, the most relevant information for classifying crops.
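The best-performing augmentation process, random sampling within the crop-field segments produced by the segmenter, can be sketched roughly as follows. The binary-mask representation, function name, and sample count are assumptions for illustration, not the thesis implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_within_segment(segment_mask, n_samples, rng):
    """Pick random pixel coordinates inside a binary field mask.
    Each sampled pixel's multispectral time series can then serve
    as an extra training example carrying the field's crop label."""
    rows, cols = np.nonzero(segment_mask)
    idx = rng.choice(len(rows), size=min(n_samples, len(rows)), replace=False)
    return list(zip(rows[idx], cols[idx]))

# Toy scene: one 4x5 field segment inside a 10x10 image.
mask = np.zeros((10, 10), dtype=bool)
mask[2:6, 3:8] = True
coords = sample_within_segment(mask, 5, rng)
# Every sampled coordinate falls inside the segment.
```

Sampling only inside segmented fields keeps the augmented points on homogeneous crop areas instead of field boundaries or mixed pixels.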
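Balanced accuracy, the metric on which the 57% result above improves on the competition winner, is the mean of per-class recall, so minority crop classes weigh as much as dominant ones. A minimal sketch of the computation (crop labels and counts are illustrative; the thesis itself cites scikit-learn's `balanced_accuracy_score` [45]):

```python
import numpy as np

def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recall: each class counts equally,
    regardless of how many samples it has."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    recalls = [np.mean(y_pred[y_true == c] == c) for c in np.unique(y_true)]
    return float(np.mean(recalls))

# Toy imbalanced case: 8 "soy" samples, 2 "maize" samples.
y_true = ["soy"] * 8 + ["maize"] * 2
y_pred = ["soy"] * 8 + ["soy", "maize"]  # one maize misclassified

print(np.mean(np.array(y_true) == np.array(y_pred)))  # plain accuracy: 0.9
print(balanced_accuracy(y_true, y_pred))              # (1.0 + 0.5) / 2 = 0.75
```

The gap between 83.4% accuracy and 57% balanced accuracy in the results suggests exactly this kind of class imbalance in the crop dataset.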
Tipo de objeto: | Tesis (Proyecto Integrador Ingeniería en Telecomunicaciones) |
Palabras Clave: | Machine learning; Aprendizaje automático; Remote sensing; Detección a distancia; Satellite image; Imágenes satelitales; Image segmentation; Segmentación; Crop classification; Clasificación de cultivos; Deep learning; Aprendizaje profundo; Sensado remoto |
Referencias: |
[1] ESA. Sentinel-2. https://sentinel.esa.int/web/sentinel/missions/sentinel-2.
[2] Diakogiannis, F. I., Waldner, F., Caccetta, P., Wu, C. ResUNet-a: a deep learning framework for semantic segmentation of remotely sensed data. CoRR, abs/1904.00592, 2019. URL http://arxiv.org/abs/1904.00592.
[3] Ronneberger, O., Fischer, P., Brox, T. U-Net: Convolutional networks for biomedical image segmentation. CoRR, abs/1505.04597, 2015. URL http://arxiv.org/abs/1505.04597.
[4] Siddique, N., Paheding, S., Elkin, C., Devabhaktuni, V. U-Net and its variants for medical image segmentation: A review of theory and applications. IEEE Access, 2021.
[5] Fernández, J. G., Mehrkanoon, S. Broad-UNet: Multi-scale feature learning for nowcasting tasks. Neural Networks, 144, 419-427, 2021. URL https://www.sciencedirect.com/science/article/pii/S089360802100349X.
[6] Fernández, J. G., Abdellaoui, I. A., Mehrkanoon, S. Deep coastal sea elements forecasting using U-Net based models. CoRR, abs/2011.03303, 2020. URL https://arxiv.org/abs/2011.03303.
[7] Mazzia, V., Khaliq, A., Chiaberge, M. Improvement in land cover and crop classification based on temporal features learning from Sentinel-2 data using recurrent-convolutional neural network (R-CNN). Applied Sciences, 10 (1), 2020. URL https://www.mdpi.com/2076-3417/10/1/238.
[8] Fundación Sadosky, Meta:Data. Clasificación de cultivos utilizando imágenes satelitales. https://metadata.fundacionsadosky.org.ar/competition/22/.
[9] INTA. Mapa nacional argentino de cultivos. https://inta.gob.ar/sites/default/files/mapa_nacional_de_cultivos_campana_2018_2019.pdf.
[10] Pacala, S., Socolow, R. Stabilization wedges: Solving the climate problem for the next 50 years with current technologies. Science, 305, 968-972, 2004.
[11] Peña-Barragán, J. M., López-Granados, F., García-Torres, L., Jurado-Expósito, M., de la Orden, M. S., García-Ferrer, A. Object-based image classification of summer crops with machine learning methods. Remote Sensing, 6, 5019-5041, 2014.
[12] Chuvieco, E. Fundamentals of satellite remote sensing: An environmental approach. CRC Press, 2020.
[13] Maral, G., Bousquet, M. Satellite communications systems: systems, techniques and technologies. Wiley, 5th ed., 2009.
[14] ESA. Sentinel-2 MSI user guide. https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi.
[15] ESA. Sentinel-2 overview. https://sentinel.esa.int/web/sentinel/missions/sentinel-2/overview.
[16] ESA. Multispectral instrument (MSI). https://sentinel.esa.int/web/sentinel/missions/sentinel-2/instrument-payload.
[17] ESA. Bands and radiometric resolutions. https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi/resolutions/radiometric.
[18] ESA. Processing levels. https://sentinels.copernicus.eu/web/sentinel/user-guides/sentinel-2-msi/processing-levels.
[19] ESA. Product types. https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi/product-types.
[20] ESA. Products and algorithms. https://sentinels.copernicus.eu/web/sentinel/technical-guides/sentinel-2-msi/products-algorithms.
[21] Géron, A. Hands-on machine learning with Scikit-Learn and TensorFlow: concepts, tools, and techniques to build intelligent systems. 2017. URL http://www.amazon.com/exec/obidos/redirect?tag=citeulike07-20&path=ASIN/1491962291.
[22] Chollet, F. Deep Learning with Python. 1st ed. USA: Manning Publications Co., 2017.
[23] Lary, D. J., Alavi, A. H., Gandomi, A. H., Walker, A. L. Machine learning in geosciences and remote sensing. Geoscience Frontiers, 7, 3-10, 2016.
[24] Haykin, S. Neural networks and learning machines. 3rd ed. Upper Saddle River, NJ, USA: Pearson Education, 2009.
[25] Nwankpa, C., Ijomah, W., Gachagan, A., Marshall, S. Activation functions: Comparison of trends in practice and research for deep learning. CoRR, abs/1811.03378, 2018. URL http://arxiv.org/abs/1811.03378.
[26] Szeliski, R. Computer vision: Algorithms and applications. Springer, 2011.
[27] Minaee, S., Boykov, Y., Porikli, F., Plaza, A., Kehtarnavaz, N., Terzopoulos, D. Image segmentation using deep learning: A survey. CoRR, abs/2001.05566, 2020. URL https://arxiv.org/abs/2001.05566.
[28] Chen, L., Papandreou, G., Schroff, F., Adam, H. Rethinking atrous convolution for semantic image segmentation. CoRR, abs/1706.05587, 2017. URL http://arxiv.org/abs/1706.05587.
[29] eo-learn. Sentinel Hub - EO Research Team. https://eo-learn.readthedocs.io/en/latest/.
[30] Harris, C. R., Millman, K. J., van der Walt, S. J., Gommers, R., Virtanen, P., Cournapeau, D., et al. Array programming with NumPy. Nature, 585 (7825), 357-362, 2020. URL https://doi.org/10.1038/s41586-020-2649-2.
[31] Shapely. https://shapely.readthedocs.io/en/stable/.
[32] Sentinel Hub. eo-flow. https://github.com/sentinel-hub/eo-flow.
[33] Chollet, F. Keras. https://github.com/fchollet/keras, 2015.
[34] Abadi, M., Agarwal, A., Barham, P., Brevdo, E., Chen, Z., Citro, C., et al. TensorFlow: Large-scale machine learning on heterogeneous systems, 2015. URL http://tensorflow.org/.
[35] Zhao, H., Shi, J., Qi, X., Wang, X., Jia, J. Pyramid scene parsing network. CoRR, abs/1612.01105, 2016. URL http://arxiv.org/abs/1612.01105.
[36] JupyterLab. https://jupyter.org.
[37] Waldner, F., Diakogiannis, F. I. Deep learning on edge: extracting field boundaries from satellite images with a convolutional neural network. CoRR, abs/1910.12023, 2019. URL http://arxiv.org/abs/1910.12023.
[38] GDAL Community. gdal_contour. https://gdal.org/programs/gdal_contour.html.
[39] Zhong, L., Hu, L., Zhou, H. Deep learning based multi-temporal crop classification. Remote Sensing of Environment, 221, 430-443, 2019.
[40] ESA. Cloud masks. https://sentinel.esa.int/web/sentinel/technical-guides/sentinel-2-msi/level-1c/cloud-masks.
[41] Pandas development team. DataFrame.fillna. https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.fillna.html.
[42] tsaug library. https://tsaug.readthedocs.io/en/stable/.
[43] Le Guennec, A., Malinowski, S., Tavenard, R. Data augmentation for time series classification using convolutional neural networks. En: ECML/PKDD Workshop on Advanced Analytics and Learning on Temporal Data. Riva del Garda, Italy, 2016. URL https://halshs.archives-ouvertes.fr/halshs-01357973.
[44] TensorFlow. TimeDistributed layer. https://www.tensorflow.org/api_docs/python/tf/keras/layers/TimeDistributed.
[45] Scikit-learn. Balanced accuracy. https://scikit-learn.org/stable/modules/generated/sklearn.metrics.balanced_accuracy_score.html.
[46] Breiman, L. Bagging predictors. Machine Learning, 24 (2), 123-140, 1996.
[47] Ministerio de Agricultura, Pesca y Alimentación (España). Forrajes. https://www.mapa.gob.es/es/agricultura/temas/producciones-agricolas/cultivos-herbaceos/forrajes/.
[48] INTA. Forrajes y pasturas. https://inta.gob.ar/forrajes-y-pasturas.
[49] Letteer, C. R. Experiments in crop production on fallow land at San Antonio. Washington, D.C.: U.S. Dept. of Agriculture, 1914. URL https://www.biodiversitylibrary.org/item/191027, https://www.biodiversitylibrary.org/bibliography/108987.
[50] Kussul, N., Lavreniuk, M., Skakun, S., Shelestov, A. Deep learning classification of land cover and crop types using remote sensing data. IEEE Geoscience and Remote Sensing Letters, 2017.
[51] AFA. RAFA 2021, p. 518. http://rafa.fisica.org.ar/wp-content/uploads/2021/10/Rafa2021.pdf. |
Materias: | Ingeniería en telecomunicaciones > Aprendizaje profundo; Ingeniería en telecomunicaciones > Sensado remoto |
Divisiones: | INVAP > Área de análisis y sistemas complejos |
Código ID: | 1113 |
Depositado Por: | Tamara Cárcamo |
Depositado En: | 19 Sep 2022 11:36 |
Última Modificación: | 19 Sep 2022 11:36 |