Optimal control is a mathematical field concerned with control policies that can be derived using optimization algorithms. The optimal control approach to robust control design differs from the more commonly discussed direct approaches: it first translates the robust control problem into an equivalent optimal control problem, and then solves that optimal control problem.
Robust Control Design: An Optimal Control Approach offers a complete presentation of this approach to robust control design, presenting modern control theory in a concise manner. The other two major approaches to robust control design, the H-infinity approach and the Kharitonov approach, are also covered and described in the simplest terms possible, in order to provide a complete overview of the area. The book includes up-to-date research and offers both theoretical and practical applications, including flexible structures, robotics, and automotive and aircraft control.
Robust Control Design: An Optimal Control Approach will be of interest to those needing an introductory textbook on robust control theory, design, and applications, as well as graduate and postgraduate students involved in systems and control research. Practitioners will also find the applications presented useful when solving practical engineering problems.
Keywords: SCIENCE / System Theory SCI064000