In the past few decades, there have been remarkable advances in systems and control theory, thanks to the unprecedented interaction between mathematics and the physical and engineering sciences. Recently, optimal control theory for dynamic systems driven by vector measures has attracted increasing interest. This book presents the theory for dynamic systems governed by both ordinary and stochastic differential equations, including extensive results on the existence of optimal controls and necessary conditions for optimality. Computational algorithms based on these optimality conditions are developed, and numerical results are presented to demonstrate the applicability of the theoretical results.
The book will be of interest to researchers in optimal control and applied functional analysis who are interested in applications of vector measures to control theory, stochastic systems driven by vector measures, and related topics. In particular, this self-contained account can serve as a starting point for further advances in the theory and applications of dynamic systems driven and controlled by vector measures.