Authors: Passos, Leandro Aparecido [UNESP]; Rosa, Gustavo Henrique de [UNESP]; Rodrigues, Douglas; Roder, Mateus [UNESP]; Papa, João Paulo [UNESP]

Deposit date: 2022-04-30
Issue date: 2020-01-01

Source: Natural Computing Series, p. 67-96.
ISSN: 1619-7127
URI: http://hdl.handle.net/11449/233002

Abstract: Machine learning techniques are capable of talking, interpreting, creating, and even reasoning about virtually any subject. Moreover, their learning power has grown exponentially over the last few years due to advances in hardware architecture. Nevertheless, most of these models still struggle in practical usage, since they require a proper selection of hyper-parameters, which are often chosen empirically. This requirement is even stronger for deep learning models, which commonly involve a larger number of hyper-parameters. A family of nature-inspired optimization techniques, known as meta-heuristics, offers a straightforward way to tackle such problems, since these methods do not employ derivatives and thus alleviate the computational burden. Therefore, this work proposes a comparison among several meta-heuristic optimization techniques in the context of Deep Belief Network hyper-parameter fine-tuning. An experimental setup was conducted over three public datasets on the task of binary image reconstruction and demonstrated consistent results, posing meta-heuristic techniques as a suitable alternative for the problem.

Pages: 67-96
Language: eng
Title: On the Assessment of Nature-Inspired Meta-Heuristic Optimization Techniques to Fine-Tune Deep Belief Networks
Type: Book chapter
DOI: 10.1007/978-981-15-3685-4_3
Scopus ID: 2-s2.0-85086100220