Fine-tuning enhanced probabilistic neural networks using metaheuristic-driven optimization
Many approaches using neural networks have been studied in recent years. A number of architectures for different objectives are presented in the literature, including probabilistic neural networks (PNNs), which have shown good results in several applications. A simple and elegant extension of PNNs is the enhanced probabilistic neural network (EPNN), whose idea is to consider only the samples that fall in the neighborhood of a given training sample when estimating its probability density function. In this work, we propose to fine-tune EPNN parameters by means of metaheuristic-driven optimization techniques, and we evaluate the results on a number of public datasets.
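The neighborhood idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it estimates each class-conditional density with Gaussian kernels restricted to the k nearest training samples, and tunes the smoothing parameter sigma with a simple random search standing in for the metaheuristics the work actually employs. All function names and parameter ranges here are hypothetical.

```python
import numpy as np

def epnn_class_pdf(x, class_samples, sigma, k):
    """EPNN-style estimate of p(x | class): only the k nearest training
    samples (a neighborhood of x) contribute Gaussian kernels."""
    d2 = np.sum((class_samples - x) ** 2, axis=1)
    neigh = np.sort(d2)[:k]  # restrict the sum to the neighborhood
    return np.mean(np.exp(-neigh / (2.0 * sigma ** 2)))

def classify(x, classes, sigma, k):
    # Bayes decision rule: pick the class with the largest estimated density.
    return max(classes, key=lambda c: epnn_class_pdf(x, classes[c], sigma, k))

def tune_sigma(classes, val_X, val_y, k, iters=50, seed=0):
    """Toy stand-in for metaheuristic fine-tuning: random search over sigma,
    scored by validation accuracy."""
    rng = np.random.default_rng(seed)
    best_sigma, best_acc = 1.0, -1.0
    for _ in range(iters):
        sigma = float(rng.uniform(0.05, 2.0))
        preds = [classify(x, classes, sigma, k) for x in val_X]
        acc = float(np.mean([p == y for p, y in zip(preds, val_y)]))
        if acc > best_acc:
            best_sigma, best_acc = sigma, acc
    return best_sigma, best_acc
```

In the paper's setting, the random search above would be replaced by a population-based metaheuristic, which explores the parameter space with the same black-box objective (validation accuracy).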