Fine-Tuning Dropout Regularization in Energy-Based Deep Learning
Date
2021-01-01
Type
Conference paper
Abstract
Deep Learning architectures have been extensively studied in recent years, mainly due to their discriminative power in Computer Vision. However, one problem related to such models concerns their number of parameters and hyperparameters, which can easily reach hundreds of thousands. Additional drawbacks are their need for extensive training datasets and their high susceptibility to overfitting. Recently, the simple idea of randomly disconnecting neurons from a network, known as Dropout, has been shown to be a promising solution, although it requires an adequate hyperparameter setting. Therefore, this work addresses the problem of finding suitable Dropout ratios through meta-heuristic optimization in the task of image reconstruction. Energy-based Deep Learning architectures, such as Restricted Boltzmann Machines and Deep Belief Networks, were combined with several meta-heuristic techniques, such as Particle Swarm Optimization, the Bat Algorithm, the Firefly Algorithm, and Cuckoo Search. The experimental results demonstrate the feasibility of using meta-heuristic optimization to find suitable Dropout ratios on three literature datasets and reinforce bio-inspired optimization as an alternative to choosing regularization-based hyperparameters empirically.
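The abstract only outlines the pipeline, so the following is a minimal sketch, not the authors' implementation, of the loop it describes: a one-dimensional Particle Swarm Optimization searches for the Dropout ratio of a toy Bernoulli RBM trained with one-step Contrastive Divergence, scoring each candidate by mean squared reconstruction error. All function names, hyperparameters, and the synthetic binary data below are illustrative assumptions.

```python
# Sketch only: PSO over a single hyperparameter (the Dropout ratio) of a
# toy Bernoulli RBM. Everything here is illustrative, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, p_drop, n_hidden=32, epochs=10, lr=0.05):
    """Train a Bernoulli RBM with mean-field CD-1, dropping hidden units with
    probability p_drop, and return the mean squared reconstruction error."""
    n_visible = data.shape[1]
    W = rng.normal(0, 0.01, (n_visible, n_hidden))
    b = np.zeros(n_visible)   # visible bias
    c = np.zeros(n_hidden)    # hidden bias
    for _ in range(epochs):
        v0 = data
        # One shared Dropout mask per epoch, for brevity.
        mask = (rng.random(n_hidden) >= p_drop).astype(float)
        h0 = sigmoid(v0 @ W + c) * mask
        v1 = sigmoid(h0 @ W.T + b)
        h1 = sigmoid(v1 @ W + c) * mask
        W += lr * (v0.T @ h0 - v1.T @ h1) / len(data)
        b += lr * (v0 - v1).mean(axis=0)
        c += lr * (h0 - h1).mean(axis=0)
    # Evaluation without the mask (weight re-scaling omitted for brevity).
    v_rec = sigmoid(sigmoid(data @ W + c) @ W.T + b)
    return float(((data - v_rec) ** 2).mean())

def pso(fitness, n_particles=5, iters=10, w=0.7, c1=1.5, c2=1.5):
    """Plain-vanilla PSO over one bounded variable (the Dropout ratio)."""
    x = rng.uniform(0.0, 0.9, n_particles)   # particle positions
    v = np.zeros(n_particles)                # particle velocities
    pbest, pbest_f = x.copy(), np.array([fitness(xi) for xi in x])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 0.9)
        f = np.array([fitness(xi) for xi in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()]
    return gbest, pbest_f.min()

# Synthetic binary "images" stand in for the literature datasets.
data = (rng.random((200, 64)) < 0.3).astype(float)
best_p, best_err = pso(lambda p: train_rbm(data, p))
print(f"best Dropout ratio ~ {best_p:.3f}, reconstruction error = {best_err:.4f}")
```

The same wrapper generalizes to the other meta-heuristics named in the abstract (Bat, Firefly, Cuckoo Search): only the position-update rule inside the search loop changes, while the fitness function, training an energy-based model and returning its reconstruction error, stays identical.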
Language
English
How to cite
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v. 12702 LNCS, p. 99-108.