A maximum principle for infinite time asymptotically stable impulsive dynamic control systems
Date
2010-12-01
Type
Conference paper
Access rights
Open access
Abstract
We consider an infinite horizon optimal impulsive control problem in which a given cost function is minimized by choosing control strategies that drive the state to a point in a given closed set C∞. We present necessary conditions of optimality in the form of a maximum principle whose boundary condition on the adjoint variable ensures that the conditions do not degenerate as a consequence of the infinite time horizon. These conditions are given first for conventional systems and then for impulsive control problems. They are proved by considering a family of approximating auxiliary conventional (impulse-free) optimal control problems defined on an increasing sequence of finite time intervals. As far as we know, results of this kind have not been derived previously. © 2010 IFAC.
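The LaTeX sketch below restates the problem class described in the abstract in schematic form; the symbols (state x, control u, control measure μ, dynamics f and g, running cost l, target set C∞) and the exact measure-driven structure of the dynamics are illustrative assumptions, not taken from the paper.

% Schematic infinite-horizon impulsive optimal control problem.
% All symbols and the precise form of the dynamics are assumptions
% introduced for illustration only.
\begin{align*}
  &\text{Minimize } \int_{0}^{\infty} l\bigl(t, x(t), u(t)\bigr)\,dt\\
  &\text{subject to } dx(t) = f\bigl(t, x(t), u(t)\bigr)\,dt
      + g\bigl(t, x(t)\bigr)\,d\mu(t), \qquad x(0) = x_0,\\
  &\phantom{\text{subject to }} u(t) \in U \text{ a.e.}, \qquad
      \mu \text{ a (vector-valued) control measure},\\
  &\phantom{\text{subject to }} \lim_{t \to \infty} x(t) \in C_\infty .
\end{align*}

Under this reading, the maximum principle referred to in the abstract supplies an adjoint arc whose boundary condition "at infinity" is chosen so that the multipliers cannot all vanish, keeping the necessary conditions informative despite the unbounded horizon.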
Keywords
Control, Maximum principle, Necessary conditions of optimality, Optimal stability, Adjoint variables, Asymptotically stable, Closed set, Control strategies, Conventional systems, Dynamic control systems, Finite time intervals, Impulsive controls, Infinite horizons, Infinite time, Non-degeneracy, Optimal control problem, Optimal impulsive control, Time horizons, Control theory, Nonlinear control systems, Optimization, Control system stability
Language
English
How to cite
IFAC Proceedings Volumes (IFAC-PapersOnline), p. 1326-1331.