Inductive Self-Supervised Dimensionality Reduction for Image Retrieval
Type
Conference paper
Abstract
The exponential growth of multimedia data creates a pressing need for approaches capable of efficiently handling Content-Based Image Retrieval (CBIR) in large and continuously evolving datasets. Dimensionality reduction techniques, such as t-SNE and UMAP, have been widely used to transform high-dimensional features into more discriminative, low-dimensional representations. These transformations improve the effectiveness of retrieval systems by not only preserving but also enhancing the underlying structure of the data. However, their transductive nature requires access to the entire dataset during the reduction process, limiting their use in dynamic environments where data is constantly added. In this paper, we propose ISSDiR, a self-supervised, inductive dimensionality reduction method that generalizes to unseen data, offering a practical solution for continuously expanding datasets. Our approach integrates neural network-based feature extraction with clustering-based pseudo-labels and introduces a hybrid loss function that combines cross-entropy and contrastive loss, weighted by cluster distances. Extensive experiments demonstrate the competitive performance of the proposed method on multiple datasets, indicating its potential to contribute to the field of image retrieval by introducing a novel inductive approach specifically designed for dimensionality reduction in retrieval tasks.
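The abstract only names the ingredients of the hybrid loss, so the following is a minimal sketch of one plausible reading, assuming a PyTorch setting. Everything not stated above is an assumption made for illustration: the exponential confidence weight derived from each sample's distance to its assigned centroid, the margin-based pairwise contrastive term, the mixing coefficient alpha, and the helper name hybrid_loss are all hypothetical, not the paper's actual formulation.

```python
# Hypothetical sketch only: the exact loss is not given in this record, so the
# weighting scheme, contrastive variant, and hyperparameters below are
# assumptions made for illustration.
import torch
import torch.nn.functional as F

def hybrid_loss(embeddings, logits, pseudo_labels, centroids,
                margin=1.0, alpha=0.5):
    """Cross-entropy on clustering-based pseudo-labels combined with a
    pairwise contrastive term, weighted by distances to cluster centroids."""
    # Classification term against the pseudo-labels produced by clustering.
    ce = F.cross_entropy(logits, pseudo_labels)

    # Per-sample confidence from distance to the assigned centroid
    # (assumption: the abstract only says "weighted by cluster distances").
    dist_to_centroid = torch.norm(embeddings - centroids[pseudo_labels], dim=1)
    w = torch.exp(-dist_to_centroid)  # nearer the centroid => larger weight

    # Margin-based contrastive loss over all pairs in the batch: pull
    # same-pseudo-label pairs together, push others at least `margin` apart.
    pairwise = torch.cdist(embeddings, embeddings)
    same = (pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)).float()
    pos = same * pairwise.pow(2)
    neg = (1.0 - same) * F.relu(margin - pairwise).pow(2)
    pair_w = w.unsqueeze(0) * w.unsqueeze(1)
    contrastive = (pair_w * (pos + neg)).sum() / pair_w.sum()

    # Fixed convex combination of the two terms (alpha is an assumption).
    return alpha * ce + (1.0 - alpha) * contrastive
```

Down-weighting pairs whose members lie far from their centroids is one natural way to let noisy pseudo-labels contribute less to the contrastive term; the paper itself may weight the terms differently.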
Keywords
Content-Based Image Retrieval, Dimensionality Reduction, Neural Networks, Self-Supervised Learning
Language
English
Citation
Proceedings of the International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, v. 2, p. 383-391.