A New Approach to Learn Spatio-Spectral Texture Representation with Randomized Networks: Application to Brazilian Plant Species Identification
Type
Conference paper
Abstract
Texture and color are fundamental visual descriptors that complement each other. Although many approaches have been developed for color-texture analysis, they often lack spectral analysis of the image and suffer from limited training data in various problems. This paper introduces a new single-parameter texture representation that integrates spatial and spectral analyses by combining the weights of the output layers of randomized autoencoders applied to both the same and adjacent image channels. Because our approach is not end-to-end, we can extract an individual representation for each image regardless of dataset size and without the need for fine-tuning. The rationale behind this approach is to learn meaningful spatial and spectral information from color-texture images through a simple neural network architecture. The proposed representation was evaluated on four benchmark datasets: Outex, USPtex, 1200Tex, and MBT. We also assess its performance on the practical and challenging task of Brazilian plant species identification. The experiments reveal that our method achieves competitive classification accuracy in both scenarios compared to other methods, including various complex deep learning architectures. This constitutes an important contribution to color-texture analysis and a useful resource for other areas of computer vision and pattern recognition.
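
As a concrete illustration of the idea described in the abstract, and not the authors' exact implementation, the following minimal Python sketch builds a randomized-autoencoder descriptor, assuming 3x3 pixel neighborhoods, a sigmoid hidden layer, and ridge-regularized output weights solved in closed form. All names here (extract_patches, rae_output_weights, spatio_spectral_descriptor, the hidden size q, and the regularizer lam) are hypothetical; the paper's architecture and parameters may differ.

import numpy as np

def extract_patches(channel, size=3):
    # Collect every size x size neighborhood of one image channel as a row
    # of a matrix (one flattened neighborhood per interior pixel).
    h, w = channel.shape
    r = size // 2
    rows = []
    for i in range(r, h - r):
        for j in range(r, w - r):
            rows.append(channel[i - r:i + r + 1, j - r:j + r + 1].ravel())
    return np.asarray(rows, dtype=np.float64)

def rae_output_weights(X, Y, q=9, lam=1e-3, seed=0):
    # Randomized autoencoder: the input weights W are random and never
    # trained; only the output weights are "learned", via ridge regression.
    # A fixed seed keeps the random projection identical for every image,
    # so descriptors remain comparable across the dataset.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1] + 1, q))
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # bias column
    H = 1.0 / (1.0 + np.exp(-Xb @ W))               # sigmoid hidden layer
    # beta = (H^T H + lam*I)^{-1} H^T Y, the only trained parameters
    beta = np.linalg.solve(H.T @ H + lam * np.eye(q), H.T @ Y)
    return beta.ravel()

def spatio_spectral_descriptor(image):
    # Concatenate output weights learned on each channel alone (spatial)
    # and on each pair of adjacent channels (spectral), per the abstract.
    chans = [image[..., c].astype(np.float64) / 255.0
             for c in range(image.shape[-1])]
    feats = []
    for a in range(len(chans)):
        Xa = extract_patches(chans[a])
        feats.append(rae_output_weights(Xa, Xa))    # reconstruct same channel
        b = (a + 1) % len(chans)
        Xb = extract_patches(chans[b])
        feats.append(rae_output_weights(Xa, Xb))    # reconstruct adjacent channel
    return np.concatenate(feats)

With q = 9 hidden neurons and an RGB image, this sketch yields a 6 x 9 x 9 = 486-dimensional vector. Because every step is a closed-form least-squares solve, the descriptor is computed independently for each image with no gradient-based training, which mirrors the abstract's claim that the representation does not depend on dataset size or fine-tuning.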
Keywords
Color-texture, Randomized neural network, Representation learning
Language
English
Citation
Communications in Computer and Information Science, v. 2141 CCIS, p. 435-449.