Infinite Dimensional Optimization and Control Theory

Author: Hector O. Fattorini
Publisher: Cambridge University Press
Total Pages: 828
Release: 1999-03-28
Genre: Computers
ISBN: 9780521451253

Treats optimal control problems for systems described by ODEs and PDEs, using an approach that unifies finite- and infinite-dimensional nonlinear programming.

Optimal Control Theory for Infinite Dimensional Systems

Author: Xunjing Li
Publisher: Springer Science & Business Media
Total Pages: 462
Release: 2012-12-06
Genre: Mathematics
ISBN: 1461242606

Infinite dimensional systems can be used to describe many phenomena in the real world. As is well known, heat conduction, properties of elastic-plastic material, fluid dynamics, diffusion-reaction processes, etc., all lie within this area. The object that we are studying (temperature, displacement, concentration, velocity, etc.) is usually referred to as the state. We are interested in the case where the state satisfies proper differential equations that are derived from certain physical laws, such as Newton's law, Fourier's law, etc. The space in which the state exists is called the state space, and the equation that the state satisfies is called the state equation. By an infinite dimensional system we mean one whose corresponding state space is infinite dimensional. In particular, we are interested in the case where the state equation is one of the following types: partial differential equation, functional differential equation, integro-differential equation, or abstract evolution equation. The case in which the state equation is a stochastic differential equation is also an infinite dimensional problem, but we will not discuss such a case in this book.
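As a concrete illustration of the point above, here is a minimal sketch of the 1D heat equation (Fourier's law): the state is the entire temperature profile, which lives in an infinite-dimensional function space, and the grid below is only a finite-dimensional approximation of it. All names and values are illustrative, not taken from the book.

```python
import numpy as np

def heat_step(u, dx, dt):
    """One explicit Euler step of u_t = u_xx on the interior grid points."""
    lap = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2
    u_new = u + dt * lap
    u_new[0] = u_new[-1] = 0.0      # Dirichlet boundary conditions u = 0
    return u_new

n = 51                              # finite-dimensional approximation of the state
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
dt = 0.4 * dx**2                    # below the explicit-Euler stability limit
u = np.sin(np.pi * x)               # initial temperature profile (the state)

for _ in range(200):
    u = heat_step(u, dx, dt)

# Heat dissipates: the peak temperature decays toward zero over time.
```

Refining the grid (larger n) gives better approximations to the true state; no finite n captures it exactly, which is precisely what makes the system infinite dimensional.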

Stochastic Optimal Control in Infinite Dimension

Author: Giorgio Fabbri
Publisher: Springer
Total Pages: 928
Release: 2017-06-22
Genre: Mathematics
ISBN: 3319530674

Providing an introduction to stochastic optimal control in infinite dimension, this book gives a complete account of the theory of second-order HJB equations in infinite-dimensional Hilbert spaces, focusing on its applicability to associated stochastic optimal control problems. It features a general introduction to optimal stochastic control, including basic results (e.g. the dynamic programming principle) with proofs, and provides examples of applications. A complete and up-to-date exposition of the existing theory of viscosity solutions and regular solutions of second-order HJB equations in Hilbert spaces is given, together with an extensive survey of other methods, with a full bibliography. In particular, Chapter 6, written by M. Fuhrman and G. Tessitore, surveys the theory of regular solutions of HJB equations arising in infinite-dimensional stochastic control, via BSDEs. The book is of interest to both pure and applied researchers working in the control theory of stochastic PDEs, and in PDEs in infinite dimension. Readers from other fields who want to learn the basic theory will also find it useful. The prerequisites are: standard functional analysis, the theory of semigroups of operators and its use in the study of PDEs, some knowledge of the dynamic programming approach to stochastic optimal control problems in finite dimension, and the basics of stochastic analysis and stochastic equations in infinite-dimensional spaces.
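For orientation, the second-order HJB equation discussed above has, in the finite-dimensional case, the following standard form (a sketch using generic symbols for the drift b, diffusion σ, running cost f, and terminal cost g; the book's setting replaces R^n with a Hilbert space):

```latex
\partial_t V(t,x)
+ \inf_{a \in A} \Big[ \, b(x,a) \cdot \nabla_x V(t,x)
+ \tfrac{1}{2} \operatorname{Tr}\!\big( \sigma(x,a)\sigma(x,a)^{\top} D_x^2 V(t,x) \big)
+ f(x,a) \Big] = 0,
\qquad V(T,x) = g(x).
```

Here V is the value function of the stochastic control problem; the viscosity and regular solution theories surveyed in the book give rigorous meaning to this equation when x ranges over an infinite-dimensional space.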

Infinite-Dimensional Optimization and Convexity

Author: Ivar Ekeland
Publisher: University of Chicago Press
Total Pages: 175
Release: 1983-09-15
Genre: Business & Economics
ISBN: 0226199886

The Carathéodory approach; Infinite-dimensional optimization; Duality theory.

Robust Control of Infinite Dimensional Systems

Author: Ciprian Foias
Publisher: Springer
Total Pages: 238
Release: 1995-12
Genre: Technology & Engineering
ISBN:

Since its inception, H∞ optimization theory has become the control methodology of choice in robust feedback analysis and design. This monograph presents an operator-theoretic approach to H∞ control for distributed parameter systems, that is, systems which admit infinite dimensional state spaces.

Optimization by Vector Space Methods

Author: David G. Luenberger
Publisher: John Wiley & Sons
Total Pages: 348
Release: 1997-01-23
Genre: Technology & Engineering
ISBN: 9780471181170

Engineers must make decisions regarding the distribution of expensive resources in a manner that will be economically beneficial. This problem can be realistically formulated and logically analyzed with optimization theory. This book shows engineers how to use optimization theory to solve complex problems. Unifies the large field of optimization with a few geometric principles. Covers functional analysis with a minimum of mathematics. Contains problems that relate to the applications in the book.
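One of the geometric principles alluded to above is the projection theorem: the best approximation to a point y from a closed subspace is its orthogonal projection, characterized by an orthogonal residual. In R^n this reduces to ordinary least squares, as this small sketch shows (the matrix and data are made up for illustration):

```python
import numpy as np

# Columns of A span the subspace M; we project y onto M.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
y = np.array([1.0, 0.0, 2.0])

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
p = A @ coef                        # orthogonal projection of y onto M

# Projection theorem: the residual y - p is perpendicular to every
# column of A, which characterizes the distance-minimizing vector.
residual = y - p
```

The same statement, phrased in an arbitrary Hilbert space, drives much of the unification the book achieves.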

Nonlinear Optimal Control Theory

Author: Leonard David Berkovitz
Publisher: CRC Press
Total Pages: 394
Release: 2012-08-25
Genre: Mathematics
ISBN: 1466560266

Nonlinear Optimal Control Theory presents a deep, wide-ranging introduction to the mathematical theory of the optimal control of processes governed by ordinary differential equations and certain types of differential equations with memory. Many examples illustrate the mathematical issues that need to be addressed when using optimal control techniques in diverse areas. Drawing on classroom-tested material from Purdue University and North Carolina State University, the book gives a unified account of bounded state problems governed by ordinary, integrodifferential, and delay systems. It also discusses Hamilton-Jacobi theory. By providing a sufficient and rigorous treatment of finite dimensional control problems, the book equips readers with the foundation to deal with other types of control problems, such as those governed by stochastic differential equations, partial differential equations, and differential games.

Calculus of Variations and Optimal Control Theory

Author: Daniel Liberzon
Publisher: Princeton University Press
Total Pages: 255
Release: 2012
Genre: Mathematics
ISBN: 0691151873

This textbook offers a concise yet rigorous introduction to the calculus of variations and optimal control theory, and is a self-contained resource for graduate students in engineering, applied mathematics, and related subjects. Designed specifically for a one-semester course, the book begins with the calculus of variations, preparing the ground for optimal control. It then gives a complete proof of the maximum principle and covers key topics such as the Hamilton-Jacobi-Bellman theory of dynamic programming and linear-quadratic optimal control. Calculus of Variations and Optimal Control Theory also traces the historical development of the subject and features numerous exercises, notes and references at the end of each chapter, and suggestions for further study.

- Offers a concise yet rigorous introduction
- Requires limited background in control theory or advanced mathematics
- Provides a complete proof of the maximum principle
- Uses consistent notation in the exposition of classical and modern topics
- Traces the historical development of the subject
- Solutions manual (available only to teachers)

Leading universities that have adopted this book include:
- University of Illinois at Urbana-Champaign, ECE 553: Optimum Control Systems
- Georgia Institute of Technology, ECE 6553: Optimal Control and Optimization
- University of Pennsylvania, ESE 680: Optimal Control Theory
- University of Notre Dame, EE 60565: Optimal Control
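The linear-quadratic case mentioned above is the one setting where dynamic programming yields a closed-form recursion. Here is a minimal sketch of finite-horizon discrete-time LQR solved by the backward Riccati recursion; the system matrices and cost weights are toy values chosen for illustration, not from the book.

```python
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discretized double integrator
B = np.array([[0.0], [0.1]])
Q = np.eye(2)                             # state cost  x' Q x
R = np.array([[0.1]])                     # control cost u' R u
N = 50                                    # horizon length

P = Q.copy()                              # terminal cost P_N = Q
gains = []
for _ in range(N):                        # dynamic programming, backward in time
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains.reverse()                           # gains[k] is applied at step k

# Simulate the closed loop u_k = -K_k x_k; the regulator drives x to 0.
x = np.array([[1.0], [0.0]])
for K in gains:
    x = (A - B @ K) @ x
```

Each backward step is one application of the principle of optimality: the cost-to-go stays quadratic, so only the matrix P needs to be propagated.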

Mathematical Control Theory

Author: Eduardo D. Sontag
Publisher: Springer Science & Business Media
Total Pages: 543
Release: 2013-11-21
Genre: Mathematics
ISBN: 1461205778

Geared primarily to an audience consisting of mathematically advanced undergraduate or beginning graduate students, this text may additionally be used by engineering students interested in a rigorous, proof-oriented systems course that goes beyond the classical frequency-domain material and more applied courses. The minimal mathematical background required is a working knowledge of linear algebra and differential equations. The book covers what constitutes the common core of control theory and is unique in its emphasis on foundational aspects. While covering a wide range of topics written in a standard theorem/proof style, it also develops the necessary techniques from scratch. In this second edition, new chapters and sections have been added, dealing with time optimal control of linear systems, variational and numerical approaches to nonlinear control, nonlinear controllability via Lie-algebraic methods, and controllability of recurrent nets and of linear systems with bounded controls.
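A foundational result covered in courses of this kind is the Kalman rank condition: the linear system x' = Ax + Bu is controllable iff the matrix [B, AB, ..., A^(n-1)B] has full rank n. A minimal sketch, using a toy double-integrator example rather than anything from the book:

```python
import numpy as np

def controllability_matrix(A, B):
    """Stack the blocks B, AB, ..., A^(n-1)B column-wise."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # double integrator
B = np.array([[0.0], [1.0]])             # force acts on velocity only

C = controllability_matrix(A, B)
rank = np.linalg.matrix_rank(C)          # full rank n = 2 -> controllable
```

The Lie-algebraic methods mentioned above generalize exactly this rank test to nonlinear systems, replacing products A^k B with iterated Lie brackets of vector fields.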