This book studies the basic problems of observability, controllability, stability, Lyapunov stability, stabilizability and optimal control for dynamical systems represented by ordinary differential equations in a finite-dimensional Euclidean space. These problems are also considered for nonlinear dynamical systems.
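For orientation, a minimal sketch of the linear, finite-dimensional setting in which these questions are typically posed (the notation is illustrative and not taken from the book):

% Linear time-invariant control system on R^n:
% state x(t), control input u(t), observed output y(t).
\[
  \dot{x}(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t), \qquad x(t) \in \mathbb{R}^{n}.
\]
% Classical Kalman rank criteria: (A, B) is controllable and (A, C) is observable
% precisely when the following matrices have full rank n.
\[
  \operatorname{rank}\begin{bmatrix} B & AB & \cdots & A^{n-1}B \end{bmatrix} = n,
  \qquad
  \operatorname{rank}\begin{bmatrix} C \\ CA \\ \vdots \\ CA^{n-1} \end{bmatrix} = n.
\]

In the nonlinear case the systems take the form \(\dot{x} = f(x, u)\), and these algebraic rank criteria no longer apply directly.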
The contents of the book are organized to serve as an introductory-level text that helps the reader understand the basic ingredients of control theory. Numerous examples illustrate the concepts, and each chapter is supplemented by a set of exercises for the benefit of students. The prerequisites are elementary courses in analysis, differential equations and the theory of matrices.