Description
This book is devoted to an investigation of control problems that can be described by ordinary differential equations and expressed in terms of game-theoretical notions. In these terms, a strategy is a control based on the feedback principle that guarantees a definite result for the controlled process, which is subject to uncertain factors such as a move or a controlling action of the opponent. Game Theoretical Control Problems contains definitions and formalizations of differential games, existence results for equilibria, and extensive discussions of optimal strategies. Formal definitions and statements are accompanied by suitable motivations and discussions of computational algorithms. The book is addressed to mathematicians, engineers, economists, and other users of control-theoretical and game-theoretical notions.