Energy-Based Dropout in Restricted Boltzmann Machines: Why Not Go Random


Date

2020-01-01

Authors

Roder, Mateus
de Rosa, Gustavo Henrique
de Albuquerque, Victor Hugo C.
Rossi, Andre L. D.
Papa, Joao P.

Journal Title

Journal ISSN

Volume Title

Publisher

Abstract

Deep learning architectures have been widely adopted over the last years, being used in a wide range of applications, such as object recognition, image reconstruction, and signal processing. Nevertheless, such models suffer from a common problem known as overfitting, which prevents the network from predicting unseen data effectively. Regularization approaches arise in an attempt to address such a shortcoming. Among them, one can refer to the well-known Dropout, which tackles the problem by randomly shutting down a set of neurons and their connections according to a certain probability. Therefore, this approach does not consider any additional knowledge to decide which units should be disconnected. In this paper, we propose an energy-based Dropout (E-Dropout) that makes conscious decisions about whether a neuron should be dropped or not. Specifically, we design this regularization method by correlating neurons and the model's energy as an importance level, and further apply it to energy-based models such as Restricted Boltzmann Machines (RBMs). The experimental results over several benchmark datasets revealed the suitability of the proposed approach compared to traditional Dropout and standard RBMs.
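The abstract contrasts random Dropout with an energy-guided alternative for RBM hidden units. The sketch below illustrates that contrast under standard assumptions about the RBM energy, E(v, h) = -b^T v - c^T h - v^T W h; the function names, the normalized importance score, and the keep rule are hypothetical simplifications for illustration, not the paper's exact E-Dropout formulation.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def random_dropout_mask(n_hidden, p_drop, rng):
    """Classic Dropout: each hidden unit is shut down independently with
    probability p_drop, without using any knowledge about the model."""
    return (rng.random(n_hidden) >= p_drop).astype(float)


def energy_guided_mask(v, W, c, rng):
    """Illustrative energy-guided mask (simplified; NOT the paper's exact rule).

    For the RBM energy E(v, h) = -b^T v - c^T h - v^T W h, hidden unit j
    contributes -h_j * (v^T W + c)_j. Units whose expected contribution
    lowers the energy the most are treated as more important and are kept
    with higher probability.
    """
    pre = v @ W + c                       # (batch, n_hidden) pre-activations
    h = sigmoid(pre)                      # E[h_j | v]
    contrib = -(h * pre).mean(axis=0)     # average energy contribution per unit
    # Normalize so the most energy-lowering unit gets importance 1.
    importance = (contrib.max() - contrib) / (np.ptp(contrib) + 1e-12)
    # Keep each unit with probability equal to its importance score.
    return (rng.random(importance.shape) < importance).astype(float)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_visible, n_hidden, batch = 16, 8, 4
    W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
    c = np.zeros(n_hidden)
    v = rng.integers(0, 2, size=(batch, n_visible)).astype(float)

    print("random mask:       ", random_dropout_mask(n_hidden, 0.5, rng))
    print("energy-guided mask:", energy_guided_mask(v, W, c, rng))
```

In this toy version, the random mask ignores the data entirely, whereas the energy-guided mask preferentially keeps units whose activations lower the model's energy for the current batch, which mirrors the "conscious decision" idea described in the abstract.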

Description

Keywords

Computational modeling, Dropout, energy-based dropout, Image reconstruction, machine learning, Mathematical model, Neurons, regularization, restricted Boltzmann machines, Standards, Task analysis, Training

How to cite

IEEE Transactions on Emerging Topics in Computational Intelligence.

Collections