Title: Deep Boltzmann machines using adaptive temperatures
Authors: Passos Júnior, Leandro A.; Costa, Kelton A. P. [UNESP]; Papa, João P. [UNESP]
Date issued: 2017-01-01
Date deposited: 2018-12-11
Source: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), v. 10424 LNCS, p. 172-183
ISSN: 1611-3349; 0302-9743
URI: http://hdl.handle.net/11449/179135
DOI: 10.1007/978-3-319-64689-3_14
Type: Conference paper (Trabalho apresentado em evento)
Language: English
Access rights: Open access (Acesso aberto)
Scopus ID: 2-s2.0-85028518481
Pages: 172-183

Abstract: Deep learning has recently been considered a hallmark in a number of applications. Among these techniques, those based on Restricted Boltzmann Machines have attracted considerable attention, since they are energy-driven models composed of latent variables that aim at learning the probability distribution of the input data. In a nutshell, the training procedure of such models concerns minimizing the energy of each training sample in order to increase its probability. Such an optimization process therefore needs to be regularized to reach the best trade-off between exploitation and exploration. In this work, we propose an adaptive regularization approach based on temperatures, and we show its advantages considering Deep Belief Networks (DBNs) and Deep Boltzmann Machines (DBMs). The proposed approach is evaluated in the context of binary image reconstruction, where it outperforms temperature-fixed DBNs and DBMs.
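
For context, the following is a minimal sketch of the standard temperature-parameterized Bernoulli RBM formulation that this line of work builds on; the notation (W, a, b, T) is generic, and the paper's specific adaptive temperature schedule is not reproduced here.

\[
P(\mathbf{v},\mathbf{h}) = \frac{e^{-E(\mathbf{v},\mathbf{h})/T}}{\sum_{\mathbf{v}',\mathbf{h}'} e^{-E(\mathbf{v}',\mathbf{h}')/T}},
\qquad
E(\mathbf{v},\mathbf{h}) = -\mathbf{a}^{\top}\mathbf{v} - \mathbf{b}^{\top}\mathbf{h} - \mathbf{v}^{\top}\mathbf{W}\mathbf{h},
\]

so that the conditional activation of a hidden unit becomes

\[
P(h_j = 1 \mid \mathbf{v}) = \sigma\!\left(\frac{b_j + \mathbf{w}_j^{\top}\mathbf{v}}{T}\right),
\]

where \(\sigma\) is the logistic sigmoid. Setting \(T = 1\) recovers the usual RBM, while the temperature acts as the regularization knob that an adaptive scheme can adjust during training to balance exploitation and exploration.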