
Publication:
A convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imagery

dc.contributor.authorOsco, Lucas Prado
dc.contributor.authorArruda, Mauro dos Santos de
dc.contributor.authorMarcato Junior, Jose
dc.contributor.authorSilva, Neemias Buceli da
dc.contributor.authorMarques Ramos, Ana Paula
dc.contributor.authorSaito Moryia, Erika Akemi
dc.contributor.authorImai, Nilton Nobuhiro
dc.contributor.authorPereira, Danillo Roberto
dc.contributor.authorCreste, Jose Eduardo
dc.contributor.authorMatsubara, Edson Takashi
dc.contributor.authorLi, Jonathan
dc.contributor.authorGoncalves, Wesley Nunes
dc.contributor.institutionUniversidade Federal de Mato Grosso do Sul (UFMS)
dc.contributor.institutionUniv Western Sao Paulo
dc.contributor.institutionSao Paulo State Univ
dc.contributor.institutionUniv Waterloo
dc.contributor.institutionUniversidade Estadual Paulista (Unesp)
dc.date.accessioned2020-12-10T20:13:32Z
dc.date.available2020-12-10T20:13:32Z
dc.date.issued2020-02-01
dc.description.abstractVisual inspection has been a common practice to determine the number of plants in orchards, a labor-intensive and time-consuming task. Deep learning algorithms have demonstrated great potential for counting plants in unmanned aerial vehicle (UAV)-borne sensor imagery. This paper presents a convolutional neural network (CNN) approach to the challenge of estimating the number of citrus trees in highly dense orchards from UAV multispectral images. The method estimates a dense map with the confidence that a plant occurs at each pixel. A flight was conducted over an orchard of Valencia orange trees planted in a linear fashion, using a multispectral camera with four bands: green, red, red-edge and near-infrared. The approach was assessed considering the individual bands and their combinations. A total of 37,353 trees, annotated as point features, were adopted to evaluate the method. A variation of sigma (0.5, 1.0 and 1.5) was used to generate different ground-truth confidence maps, and different stages (T) were used to refine the predicted confidence map. To evaluate the robustness of our method, we compared it with two state-of-the-art object detection CNN methods (Faster R-CNN and RetinaNet). The results show better performance with the combination of green, red and near-infrared bands, achieving a Mean Absolute Error (MAE), Mean Square Error (MSE), R-2 and Normalized Root-Mean-Squared Error (NRMSE) of 2.28, 9.82, 0.96 and 0.05, respectively. This band combination, when adopting sigma = 1 and stage T = 8, resulted in an R-2, MAE, Precision, Recall and F1 of 0.97, 2.05, 0.95, 0.96 and 0.95, respectively. Our method significantly outperforms object detection methods for counting and geolocation. It was concluded that our CNN approach, developed to estimate the number and geolocation of citrus trees in high-density orchards, is satisfactory and is an effective strategy to replace the traditional visual inspection method for determining the number of plants in orchards.en
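The ground-truth confidence maps mentioned in the abstract can be illustrated with a minimal sketch: each annotated tree point contributes a 2D Gaussian peak of spread sigma, and overlapping peaks are combined per pixel. This is an assumption-laden illustration only (the function name, the maximum-combination rule, and the grid size are not taken from the paper):

```python
import numpy as np

def confidence_map(shape, points, sigma=1.0):
    """Illustrative ground-truth confidence map (hypothetical helper).

    Each annotated tree location (row, col) contributes a 2D Gaussian
    peak with standard deviation `sigma`; overlapping peaks are merged
    by taking the per-pixel maximum, so values stay in [0, 1].
    """
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]          # pixel coordinate grids
    cmap = np.zeros(shape, dtype=np.float64)
    for (py, px) in points:
        g = np.exp(-((ys - py) ** 2 + (xs - px) ** 2) / (2.0 * sigma ** 2))
        cmap = np.maximum(cmap, g)       # keep the strongest peak per pixel
    return cmap

# Example: two trees on a 9x9 patch, sigma = 1.0 (one of the values tested)
gt = confidence_map((9, 9), [(2, 2), (6, 6)], sigma=1.0)
```

Varying sigma (e.g. 0.5, 1.0, 1.5, as in the abstract) changes how sharply each peak is concentrated around a tree location, which is what the different ground-truth maps compare.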
dc.description.affiliationUniv Fed Mato Grosso do Sul, Fac Engn Architecture & Urbanism & Geog, Campo Grande, MS, Brazil
dc.description.affiliationUniv Fed Mato Grosso do Sul, Fac Comp Sci, Campo Grande, MS, Brazil
dc.description.affiliationUniv Western Sao Paulo, Fac Agron, Sao Paulo, Brazil
dc.description.affiliationUniv Western Sao Paulo, Fac Engn & Architecture, Sao Paulo, Brazil
dc.description.affiliationSao Paulo State Univ, Dept Cartog Sci, BR-19060900 Presidente Prudente, SP, Brazil
dc.description.affiliationUniv Western Sao Paulo, Fac Comp Sci, Sao Paulo, Brazil
dc.description.affiliationUniv Waterloo, Dept Geog & Environm Management, Waterloo, ON N2L 3G1, Canada
dc.description.affiliationUniv Waterloo, Dept Syst Design Engn, Waterloo, ON N2L 3G1, Canada
dc.description.sponsorshipConselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
dc.description.sponsorshipCoordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
dc.description.sponsorshipFundect
dc.description.sponsorshipIdCNPq: 433783/2018-4
dc.description.sponsorshipIdCNPq: 304173/2016-9
dc.description.sponsorshipIdCAPES: 88881.311850/2018-01
dc.description.sponsorshipIdFundect: 59/300.066/2015
dc.format.extent97-106
dc.identifierhttp://dx.doi.org/10.1016/j.isprsjprs.2019.12.010
dc.identifier.citationISPRS Journal of Photogrammetry and Remote Sensing. Amsterdam: Elsevier, v. 160, p. 97-106, 2020.
dc.identifier.doi10.1016/j.isprsjprs.2019.12.010
dc.identifier.issn0924-2716
dc.identifier.lattes2985771102505330
dc.identifier.orcid0000-0003-0516-0567
dc.identifier.urihttp://hdl.handle.net/11449/197328
dc.identifier.wosWOS:000510525500007
dc.language.isoeng
dc.publisherElsevier B.V.
dc.relation.ispartofISPRS Journal of Photogrammetry and Remote Sensing
dc.sourceWeb of Science
dc.subjectDeep learning
dc.subjectMultispectral image
dc.subjectUAV-borne sensor
dc.subjectObject detection
dc.subjectCitrus tree counting
dc.subjectOrchard
dc.titleA convolutional neural network approach for counting and geolocating citrus-trees in UAV multispectral imageryen
dc.typeArticle
dcterms.licensehttp://www.elsevier.com/about/open-access/open-access-policies/article-posting-policy
dcterms.rightsHolderElsevier B.V.
dspace.entity.typePublication
unesp.advisor.lattes2985771102505330[7]
unesp.advisor.orcid0000-0003-0516-0567[7]
unesp.author.orcid0000-0002-0258-536X[1]
unesp.author.orcid0000-0001-6633-2903[5]
unesp.departmentCartografia - FCTpt
