Continuous Time Dynamical Systems: State Estimation and Optimal Control with Orthogonal Functions by B.M. Mohan and S.K. Kar
English | ISBN: 1466517298 | 2012 | PDF | 247 pages | 4.6 MB
Optimal control deals with the problem of finding a control law for a given dynamical system such that a specified optimality criterion is achieved. The solution is a trajectory of the control variables (together with the resulting state trajectory) that minimizes a cost functional, and it is typically characterized by a set of differential equations derived from the necessary conditions of optimality.
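For concreteness, the continuous-time problem can be stated in a standard textbook form, a minimal sketch using generic notation (state x, control u, running cost L, terminal cost \phi), which is not necessarily the notation adopted in the book:

\[
\begin{aligned}
\min_{u(\cdot)} \quad & J = \phi\bigl(x(t_f)\bigr) + \int_{t_0}^{t_f} L\bigl(x(t), u(t), t\bigr)\, dt \\
\text{subject to} \quad & \dot{x}(t) = f\bigl(x(t), u(t), t\bigr), \qquad x(t_0) = x_0 .
\end{aligned}
\]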