To tune or not to tune: Recommending when to adjust SVM hyper-parameters via meta-learning


Date

2015-09-28

Authors

Mantovani, Rafael G.
Rossi, André L. D. [UNESP]
Vanschoren, Joaquin
Bischl, Bernd
Carvalho, André C.P.L.F.

Journal Title

Journal ISSN

Volume Title

Publisher

Abstract

Many classification algorithms, such as Neural Networks and Support Vector Machines, have hyper-parameters that can strongly affect the predictive performance of the models they induce. It is therefore recommended to set these hyper-parameters using optimization techniques. While such techniques usually converge to a good set of values, they typically have a high computational cost, because many candidate settings are evaluated during the optimization process, and it is often unclear whether the resulting settings will be significantly better than the defaults. When training time is limited, it helps to know in advance when these parameters should definitely be tuned. In this study, we use meta-learning to predict when optimization techniques are expected to lead to models whose predictive performance is better than that obtained with default parameter settings. Optimization can then be employed only when it is expected to improve performance, reducing the overall computational cost. We evaluate these meta-learning techniques on more than one hundred data sets. The experimental results show that it is possible to accurately predict when optimization techniques should be used instead of the default values suggested by some machine learning libraries.
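A minimal sketch of the meta-learning idea described in the abstract, not the authors' implementation: for a collection of data sets, compare a default SVM against a tuned SVM, label each data set according to whether tuning helped, and train a meta-classifier on simple data-set meta-features to predict that label for new data sets. The specific meta-features, tuning grid, improvement threshold, and use of scikit-learn below are illustrative assumptions.

```python
# Illustrative sketch only: the meta-features, grid, and threshold are assumptions,
# not the setup used in the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)

def meta_features(X, y):
    """Toy meta-features: number of examples, number of features, class balance."""
    n, d = X.shape
    counts = np.bincount(y)
    return [n, d, counts.min() / counts.max()]

meta_X, meta_y = [], []
for seed in range(10):  # stand-in for the 100+ real data sets used in the paper
    X, y = make_classification(n_samples=rng.randint(100, 400),
                               n_features=rng.randint(5, 30),
                               random_state=seed)
    # Default SVM vs. grid-search-tuned SVM, both scored by cross-validation.
    default_acc = cross_val_score(SVC(), X, y, cv=5).mean()
    grid = {"C": [0.1, 1, 10, 100], "gamma": ["scale", 0.01, 0.1, 1]}
    tuned_acc = cross_val_score(GridSearchCV(SVC(), grid, cv=3), X, y, cv=5).mean()
    meta_X.append(meta_features(X, y))
    meta_y.append(int(tuned_acc > default_acc + 0.01))  # 1 = tuning was worth it

# Meta-classifier: predicts, from meta-features alone, whether tuning should pay off.
meta_model = RandomForestClassifier(random_state=0).fit(meta_X, meta_y)
X_new, y_new = make_classification(n_samples=250, n_features=12, random_state=99)
print("Predicted 'tune?' for a new data set:",
      meta_model.predict([meta_features(X_new, y_new)]))
```

In this sketch, the expensive step (running the tuner on every data set) is only needed once to build the meta-training set; afterwards the meta-classifier can recommend "tune" or "use defaults" for a new data set from its meta-features alone, which is the cost saving the paper targets.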

Description

Keywords

Computational modeling, Nickel, Niobium, Optimization, Radio frequency, Support vector machines, Training

How to cite

Proceedings of the International Joint Conference on Neural Networks, v. 2015-September.

Collections