Description
'. . . eminently suitable as a text for an introductory course: the style is pleasant; the prerequisites are kept to a minimum . . . and the pace of the development is appropriate for most students at the senior or first-year graduate level.' — American Mathematical Monthly

The purpose of this text is to lay a broad foundation for an understanding of the problems of the calculus of variations and its many methods and techniques, and to prepare readers for the study of modern optimal control theory. The treatment is limited to a thorough discussion of single-integral problems in one or more unknown functions, where the integral is employed in the Riemannian sense.

The first three chapters deal with variational problems without constraints. Chapter 4 is a self-contained treatment of the homogeneous problem in the two-dimensional plane. In Chapter 5, the minimum principle of Pontryagin, as it applies to optimal control problems of nonpredetermined duration in which the state variables satisfy an autonomous system of first-order equations, is developed to the extent possible by classical means within the general framework of the Hamilton-Jacobi theory. Chapter 6 is devoted to a derivation of the multiplier rule for the problem of Mayer with fixed and variable endpoints, and to its application to the problem of Lagrange and the isoperimetric problem. In the last chapter, Legendre's necessary condition for a weak relative minimum and a sufficient condition for a weak relative minimum are derived within the framework of the theory of the second variation.

This book, which includes many strategically placed problems and over 400 exercises, is directed to advanced undergraduate and graduate students with a background in advanced calculus and intermediate differential equations, and is adaptable to either a one- or two-semester course on the subject.