da Silva, I. N.; Amaral, W. C.; Arruda, L. V. R.

Neural approach for solving several types of optimization problems

Journal of Optimization Theory and Applications. New York: Springer/Plenum Publishers, v. 128, n. 3, p. 563-580, 2006.
ISSN: 0022-3239
DOI: 10.1007/s10957-006-9032-9
URI: http://hdl.handle.net/11449/38376
Web of Science ID: WOS:000241554100005
Date issued: 2006-03-01
Date deposited: 2014-05-20
Language: English
Pages: 563-580
Document type: Article
Access: Restricted access
Keywords: recurrent neural networks; nonlinear optimization; dynamic programming; combinatorial optimization; Hopfield network

Abstract: Neural networks consist of highly interconnected and parallel nonlinear processing elements that are shown to be extremely effective in computation. This paper presents an architecture of recurrent neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed explicitly using the valid-subspace technique. These parameters guarantee the convergence of the network to the equilibrium points, which represent a solution of the problem considered. The problems that can be treated by the proposed approach include combinatorial optimization problems, dynamic programming problems, and nonlinear optimization problems.
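For readers unfamiliar with the valid-subspace idea mentioned in the abstract, the sketch below illustrates the general mechanism in Python. It is a minimal illustration under assumptions, not the paper's implementation: the projection matrix T, the offset vector s, the quadratic energy defined by (Q, c), the step size, and the ramp activation are placeholder choices, whereas the paper derives the network parameters explicitly for each problem class.

    import numpy as np

    def modified_hopfield(T, s, Q, c, dt=0.01, iters=5000, tol=1e-8):
        """Hopfield-style iteration: confine the state to the valid subspace
        (v <- T v + s), descend the quadratic energy E(v) = 0.5 v'Qv + c'v,
        and clamp the state with a ramp (piecewise-linear) activation."""
        v = s.copy()
        for _ in range(iters):
            v_prev = v
            v = T @ v + s                # project onto the constraint subspace
            v = v - dt * (Q @ v + c)     # gradient step on the optimization energy
            v = np.clip(v, 0.0, 1.0)     # ramp activation keeps v in [0, 1]
            if np.linalg.norm(v - v_prev) < tol:
                break
        return v

    # Toy usage (assumed example): minimize 0.5*v'Qv subject to sum(v) = 1, v in [0, 1].
    n = 4
    Q = np.diag([4.0, 3.0, 2.0, 1.0])
    c = np.zeros(n)
    T = np.eye(n) - np.ones((n, n)) / n  # projector for the hyperplane sum(v) = 1
    s = np.ones(n) / n                   # a particular point on that hyperplane
    print(modified_hopfield(T, s, Q, c))

The alternation of a subspace projection with an energy-descent step is what lets the constraint terms and the objective term be handled separately; in the paper this separation is what allows the internal parameters to be computed explicitly and convergence to valid equilibrium points to be guaranteed.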