
A photogrammetric approach for real-time visual SLAM applied to an omnidirectional system

dc.contributor.author: Garcia, Thaisa Aline Correia [UNESP]
dc.contributor.author: Tommaselli, Antonio Maria Garcia [UNESP]
dc.contributor.author: Castanheiro, Letícia Ferrari [UNESP]
dc.contributor.author: Campos, Mariana Batista
dc.contributor.institution: Universidade Estadual Paulista (UNESP)
dc.contributor.institution: Finnish Geospatial Research Institute—FGI
dc.date.accessioned: 2025-04-29T20:10:07Z
dc.date.issued: 2024-09-01
dc.description.abstract: The problem of sequentially estimating the exterior orientation of imaging sensors and reconstructing the three-dimensional environment in real time is commonly known as visual simultaneous localisation and mapping (vSLAM). Omnidirectional optical sensors have been increasingly used in vSLAM solutions, mainly because they provide a wider view of the scene, allowing the extraction of more features. However, dealing with unmodelled points in the hyperhemispherical field poses challenges, mainly due to the complex lens geometry involved in the image formation process. These challenges can be addressed with rigorous photogrammetric models that appropriately handle the geometry of fisheye lens cameras. Thus, this study presents a real-time vSLAM approach for omnidirectional systems, adapting ORB-SLAM with a rigorous projection model (equisolid-angle; see the sketch after this record). The implementation was conducted on the Nvidia Jetson TX2 board, and the approach was evaluated using hyperhemispherical images captured by a dual-fisheye camera (Ricoh Theta S) embedded in a mobile backpack platform. The trajectory covered a distance of 140 m, with the approach demonstrating accuracy better than 0.12 m at the beginning and achieving metre-level accuracy at the end of the trajectory. Additionally, we compared the performance of our proposed approach with a generic model for fisheye lens cameras. (en)
dc.description.affiliation: São Paulo State University (Unesp)
dc.description.affiliation: Finnish Geospatial Research Institute—FGI
dc.description.affiliationUnesp: São Paulo State University (Unesp)
dc.description.sponsorship: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
dc.description.sponsorship: Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
dc.description.sponsorship: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
dc.description.sponsorshipId: FAPESP: 2021/06029-7
dc.description.sponsorshipId: CNPq: 303670/2018-5
dc.description.sponsorshipId: CAPES: 88887.463908/2019-00
dc.description.sponsorshipId: CAPES: 88887.695922/2022
dc.format.extent: 577-599
dc.identifier: http://dx.doi.org/10.1111/phor.12494
dc.identifier.citation: Photogrammetric Record, v. 39, n. 187, p. 577-599, 2024.
dc.identifier.doi: 10.1111/phor.12494
dc.identifier.issn: 1477-9730
dc.identifier.issn: 0031-868X
dc.identifier.scopus: 2-s2.0-85192050588
dc.identifier.uri: https://hdl.handle.net/11449/307709
dc.language.iso: eng
dc.relation.ispartof: Photogrammetric Record
dc.source: Scopus
dc.subject: backpack systems
dc.subject: fisheye lenses
dc.subject: omnidirectional system
dc.subject: ORB-SLAM
dc.subject: real time
dc.subject: Ricoh Theta S
dc.subject: SLAM
dc.title: A photogrammetric approach for real-time visual SLAM applied to an omnidirectional system (en)
dc.type: Artigo (pt)
dspace.entity.type: Publication
unesp.author.orcid: 0000-0002-1540-6762 [1]
unesp.author.orcid: 0000-0003-0483-1103 [2]
unesp.author.orcid: 0000-0003-2940-5872 [3]
unesp.author.orcid: 0000-0003-3430-7521 [4]
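
For reference alongside the abstract above: the equisolid-angle model named there is a standard fisheye projection in which a ray with incidence angle theta (measured from the optical axis) maps to the radial image distance r = 2·f·sin(theta/2), where f is the focal length. The snippet below is a minimal illustrative sketch of that mapping only, not the authors' ORB-SLAM adaptation; the function name and the 1.3 mm focal length are hypothetical.

import math

# Equisolid-angle fisheye projection (illustrative sketch only):
# a ray at incidence angle theta (radians, from the optical axis)
# maps to radial distance r = 2 * f * sin(theta / 2) on the image plane.
def equisolid_radial_distance(theta, focal_length_mm):
    """Radial image distance, in the same unit as focal_length_mm."""
    return 2.0 * focal_length_mm * math.sin(theta / 2.0)

if __name__ == "__main__":
    # Hypothetical 1.3 mm fisheye lens: a ray 90 degrees off-axis
    # (edge of a hemispherical field) lands about 1.84 mm from the
    # principal point.
    print(equisolid_radial_distance(math.radians(90.0), 1.3))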
