Authors: Roder, Mateus; de Rosa, Gustavo Henrique; de Albuquerque, Victor Hugo C.; Rossi, Andre L. D.; Papa, Joao P.
Date available: 2021-06-25
Date issued: 2020-01-01
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
ISSN: 2471-285X
URI: http://hdl.handle.net/11449/205676

Abstract: Deep learning architectures have been widely adopted in recent years across a range of applications, such as object recognition, image reconstruction, and signal processing. Nevertheless, such models suffer from a common problem known as overfitting, which prevents the network from generalizing to unseen data. Regularization approaches attempt to address this shortcoming. Among them is the well-known Dropout, which tackles the problem by randomly shutting down a set of neurons and their connections according to a certain probability; it therefore uses no additional knowledge to decide which units should be disconnected. In this paper, we propose an energy-based Dropout (E-Dropout) that makes conscious decisions about whether a neuron should be dropped. Specifically, we design this regularization method by correlating neurons with the model's energy as an importance level, and apply it to energy-based models such as Restricted Boltzmann Machines (RBMs). Experimental results over several benchmark datasets show the proposed approach's suitability compared to the traditional Dropout and standard RBMs.

Language: English
Keywords: Computational modeling; Dropout; energy-based dropout; Image reconstruction; machine learning; Mathematical model; Neurons; regularization; restricted Boltzmann machines; Standards; Task analysis; Training
Title: Energy-Based Dropout in Restricted Boltzmann Machines: Why Not Go Random
Type: Article
DOI: 10.1109/TETCI.2020.3043764
Scopus ID: 2-s2.0-85098746260
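
The abstract contrasts E-Dropout with standard Dropout, which zeroes each unit independently with a fixed probability and uses no model knowledge. A minimal NumPy sketch of that baseline (the random masking E-Dropout replaces) is below; the function name, the inverted-dropout scaling, and all parameters are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def dropout_mask(n_units, p_drop, rng=None):
    """Bernoulli mask: each unit is kept with probability 1 - p_drop.

    Illustrates the standard (random) Dropout baseline described in the
    abstract; this is a hypothetical sketch, not the paper's E-Dropout.
    """
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(n_units) >= p_drop
    # Inverted dropout: rescale kept units so the expected activation
    # matches the no-dropout case.
    return keep.astype(float) / (1.0 - p_drop)

# Apply to a vector of hidden-unit activations (e.g., an RBM hidden layer).
h = np.ones(8)
mask = dropout_mask(8, p_drop=0.5, rng=np.random.default_rng(0))
h_dropped = h * mask  # dropped units are zeroed; kept units are scaled by 2.0
```

E-Dropout, by contrast, derives the drop decision from each neuron's contribution to the model's energy rather than from an unconditional Bernoulli draw.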