Deep convolutional networks based on lightweight YOLOv8 to detect and estimate peanut losses from images in post-harvesting environments

dc.contributor.authorBrito Filho, Armando Lopes de [UNESP]
dc.contributor.authorMorlin Carneiro, Franciele
dc.contributor.authorCarreira, Vinicius dos Santos [UNESP]
dc.contributor.authorTedesco, Danilo
dc.contributor.authorCosta Souza, Jarlyson Brunno [UNESP]
dc.contributor.authorBarbosa Júnior, Marcelo Rodrigues
dc.contributor.authorSilva, Rouverson Pereira da [UNESP]
dc.contributor.institutionUniversidade Estadual Paulista (UNESP)
dc.contributor.institutionFederal Technological University of Paraná (UTFPR)
dc.contributor.institutionKansas State University
dc.contributor.institutionUniversity of Georgia
dc.date.accessioned2025-04-29T18:42:43Z
dc.date.issued2025-07-01
dc.description.abstractDetecting peanut losses is key to monitoring operational quality during mechanical harvesting. Current manual assessments face practical limitations in the field, as they tend to be exhaustive, time-consuming, and susceptible to errors, especially after long work periods. Therefore, the main objective of this study was to develop an automated image processing framework to detect, count, and estimate peanut pod losses during the harvesting operation. We proposed a robust approach encompassing different environmental conditions and training detection algorithms, specifically based on the lightweight YOLOv8 architecture, with images acquired with a mobile smartphone at six different times of the day (10 a.m., 11 a.m., 1 p.m., 2 p.m., 3 p.m., and 4 p.m.). The experimental results showed that detecting two-seed peanut pods was more effective than one-seed pods, with higher precision, recall, and mAP50 values. The best results for image acquisition were obtained between 10 a.m. and 2 p.m. The study also compared manual and automated counting methods, revealing that the best counting scenarios achieved an R2 above 0.80. Furthermore, georeferenced maps of peanut losses revealed significant spatial variability, providing critical insights for targeted interventions. These findings demonstrate the potential to enhance mechanized harvesting efficiency and lay the groundwork for future integration into fully automated systems. By incorporating this method into harvesting machinery, real-time monitoring and accurate loss quantification can be achieved, substantially reducing the need for labor-intensive manual assessments.en
dc.description.affiliationDepartment of Engineering and Mathematical Sciences School of Agricultural and Veterinarian Sciences São Paulo State University (Unesp) Jaboticabal
dc.description.affiliationFederal Technological University of Paraná (UTFPR), Paraná
dc.description.affiliationDepartment of Agronomy Kansas State University
dc.description.affiliationDepartment of Horticulture University of Georgia
dc.description.affiliationUnespDepartment of Engineering and Mathematical Sciences, School of Agricultural and Veterinarian Sciences, São Paulo State University (Unesp), Jaboticabal
dc.identifierhttp://dx.doi.org/10.1016/j.compag.2025.110282
dc.identifier.citationComputers and Electronics in Agriculture, v. 234.
dc.identifier.doi10.1016/j.compag.2025.110282
dc.identifier.issn0168-1699
dc.identifier.scopus2-s2.0-105000114909
dc.identifier.urihttps://hdl.handle.net/11449/299546
dc.language.isoeng
dc.relation.ispartofComputers and Electronics in Agriculture
dc.sourceScopus
dc.subjectArachis hypogaea L
dc.subjectObject detection
dc.subjectPeanut losses
dc.subjectSmart harvesting
dc.subjectYOLOv8
dc.titleDeep convolutional networks based on lightweight YOLOv8 to detect and estimate peanut losses from images in post-harvesting environmentsen
dc.typeArtigopt
dspace.entity.typePublication
relation.isOrgUnitOfPublication3d807254-e442-45e5-a80b-0f6bf3a26e48
relation.isOrgUnitOfPublication.latestForDiscovery3d807254-e442-45e5-a80b-0f6bf3a26e48
unesp.campusUniversidade Estadual Paulista (UNESP), Faculdade de Ciências Agrárias e Veterinárias, Jaboticabalpt