Optimal Control Theory: An Introduction


Price:
Sale price: $27.95

Description

Optimal control theory is the science of maximizing the returns from, and minimizing the costs of, the operation of physical, social, and economic processes. Geared toward upper-level undergraduates, this text introduces three aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization.
Chapters 1 and 2 focus on describing systems and evaluating their performance. Chapter 3 deals with dynamic programming. The calculus of variations and Pontryagin's minimum principle are the subjects of chapters 4 and 5, and chapter 6 examines iterative numerical techniques for finding optimal controls and trajectories. Numerous problems, intended to introduce additional topics as well as to illustrate basic concepts, appear throughout the text.
Author: Donald E. Kirk
Publisher: Dover Publications
Published: 04/30/2004
Pages: 452
Binding Type: Paperback
Weight: 1.20 lbs
Size: 8.30h x 5.40w x 1.00d
ISBN13: 9780486434841
ISBN10: 0486434842
BISAC Categories:
- Technology & Engineering | Electronics | General
- Computers | Cybernetics

This title is not returnable