Optimal Control Theory
Author: Suresh P. Sethi
Publisher: Taylor & Francis US
Total Pages: 536
Release: 2006
Genre: Business & Economics
ISBN: 9780387280929

Optimal control methods are used to determine optimal ways to control a dynamic system. The theoretical work in this field serves as the foundation for the book, in which the authors apply it to business management problems developed from their research and classroom instruction. Sethi and Thompson have provided the management science and economics communities with a thoroughly revised edition of their classic text on Optimal Control Theory. The new edition has been completely refined, with careful attention to the presentation of the text and graphic material. Chapters cover a range of topics including finance, production and inventory problems, marketing problems, machine maintenance and replacement, problems of optimal consumption of natural resources, and applications of control theory to economics. The book contains new results that were not available when the first edition was published, as well as an expansion of the material on stochastic optimal control theory.
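
As a rough sketch of the kind of problem such methods address (the notation here is generic and not taken from the book), a finite-horizon optimal control problem can be written as

    \max_{u(\cdot)} \; J = \int_0^T F(x(t), u(t), t)\, dt + S(x(T))
    \text{subject to} \quad \dot{x}(t) = f(x(t), u(t), t), \quad x(0) = x_0, \quad u(t) \in \Omega,

where x is the state of the system, u the control, F the running payoff, and S a salvage (terminal) value.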

Optimal Control Theory and Static Optimization in Economics
Author: Daniel Léonard
Publisher: Cambridge University Press
Total Pages: 372
Release: 1992-01-31
Genre: Business & Economics
ISBN: 9780521337465

Optimal control theory is a technique being used increasingly by academic economists to study problems involving optimal decisions in a multi-period framework. This textbook is designed to make the difficult subject of optimal control theory easily accessible to economists while at the same time maintaining rigour. Economic intuitions are emphasized, and examples and problem sets covering a wide range of applications in economics are provided to assist in the learning process. Theorems are clearly stated and their proofs are carefully explained. The development of the text is gradual and fully integrated, beginning with simple formulations and progressing to advanced topics such as control parameters, jumps in state variables, and bounded state space. For greater economy and elegance, optimal control theory is introduced directly, without recourse to the calculus of variations. The connection with the latter and with dynamic programming is explained in a separate chapter. A second purpose of the book is to draw the parallel between optimal control theory and static optimization. Chapter 1 provides an extensive treatment of constrained and unconstrained maximization, with emphasis on economic insight and applications. Starting from basic concepts, it derives and explains important results, including the envelope theorem and the method of comparative statics. This chapter may be used for a course in static optimization. The book is largely self-contained. No previous knowledge of differential equations is required.
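
As a brief illustration of the envelope theorem highlighted above (generic notation, not drawn from the text), consider the constrained problem

    V(\alpha) = \max_x \, f(x, \alpha) \quad \text{subject to} \quad g(x, \alpha) = c,

with Lagrangian L = f(x, \alpha) + \lambda\,[\,c - g(x, \alpha)\,]. The envelope theorem says the value function responds to the parameter only through the direct partial derivative of the Lagrangian at the optimum:

    V'(\alpha) = \left. \frac{\partial L}{\partial \alpha} \right|_{x = x^*(\alpha),\, \lambda = \lambda^*(\alpha)} = f_\alpha(x^*, \alpha) - \lambda^* g_\alpha(x^*, \alpha).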

Optimal Control Theory
Author: Suresh P. Sethi
Publisher: Springer Nature
Total Pages: 520
Release: 2022-01-03
Genre: Business & Economics
ISBN: 9783030917456

This new 4th edition offers an introduction to optimal control theory and its diverse applications in management science and economics. It introduces students to the concept of the maximum principle in continuous (as well as discrete) time by combining dynamic programming and Kuhn-Tucker theory. While some mathematical background is needed, the emphasis of the book is not on mathematical rigor, but on modeling realistic situations encountered in business and economics. It applies optimal control theory to the functional areas of management including finance, production and marketing, as well as the economics of growth and of natural resources. In addition, it features material on stochastic Nash and Stackelberg differential games and an adverse selection model in the principal-agent framework. Exercises are included in each chapter, and the answers to selected exercises help deepen readers’ understanding of the material covered. Also included are appendices of supplementary material on the solution of differential equations, the calculus of variations and its ties to the maximum principle, and special topics including the Kalman filter, certainty equivalence, singular control, a global saddle point theorem, Sethi-Skiba points, and distributed parameter systems. Optimal control methods are used to determine optimal ways to control a dynamic system. The theoretical work in this field serves as the foundation for the book, in which the author applies it to business management problems developed from his own research and classroom instruction. The new edition has been refined and updated, making it a valuable resource not only for graduate courses on applied optimal control theory but also for financial and industrial engineers, economists, and operational researchers interested in applying dynamic optimization in their fields.
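
As a hedged sketch of the maximum principle referred to above (standard textbook notation; the book's own development may differ in detail), for the problem of maximizing \int_0^T F(x, u, t)\, dt subject to \dot{x} = f(x, u, t) and x(0) = x_0, define the Hamiltonian

    H(x, u, \lambda, t) = F(x, u, t) + \lambda f(x, u, t).

Necessary conditions for an optimal pair (x^*, u^*) are then

    \dot{\lambda}(t) = -\frac{\partial H}{\partial x}\big(x^*(t), u^*(t), \lambda(t), t\big), \qquad \lambda(T) = 0,
    u^*(t) \in \arg\max_{u \in \Omega} H\big(x^*(t), u, \lambda(t), t\big),

with the state equation \dot{x}^* = f(x^*, u^*, t) closing the system.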

Introductory Optimization Dynamics
Author: P.N.V. Tu
Publisher: Springer Science & Business Media
Total Pages: 401
Release: 2013-11-11
Genre: Business & Economics
ISBN: 3662007193

Optimal Control theory has been increasingly used in Economics and Management Science in the last fifteen years or so. It is now commonplace, even at textbook level. It has been applied to a great many areas of Economics and Management Science, such as Optimal Growth, Optimal Population, Pollution control, Natural Resources, Bioeconomics, Education, International Trade, Monopoly, Oligopoly and Duopoly, Urban and Regional Economics, Arms Race control, Business Finance, Inventory Planning, Marketing, Maintenance and Replacement policy and many others. It is a powerful tool of dynamic optimization. There is no doubt that social science students should be familiar with this tool, if not for their own research, at least for reading the literature. These Lecture Notes attempt to provide a plain exposition of Optimal Control Theory, with a number of economic examples and applications designed mainly to illustrate the various techniques and point out the wide range of possible applications rather than to treat exhaustively any area of economic theory or policy. Chapters 2, 3 and 4 are devoted to the Calculus of Variations, Chapter 5 develops Optimal Control theory from the Variational approach, Chapter 6 deals with the problems of constrained state and control variables, Chapter 7 with Linear Control models, and Chapter 8 with stabilization models. Discrete systems are discussed in Chapter 9 and Sensitivity analysis in Chapter 10. Chapter 11 presents a wide range of Economics and Management Science applications.
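
As a minimal sketch of the variational machinery covered in Chapters 2, 3 and 4 (standard notation, not necessarily the author's), the basic calculus-of-variations problem seeks an extremal path x(t) of the functional

    \int_{t_0}^{t_1} F\big(t, x(t), \dot{x}(t)\big)\, dt

with fixed endpoints, and the first-order necessary condition is the Euler-Lagrange equation

    \frac{\partial F}{\partial x} - \frac{d}{dt}\left(\frac{\partial F}{\partial \dot{x}}\right) = 0.

Chapter 5 then develops optimal control from this variational starting point, with the state evolving as \dot{x} = f(t, x, u) and the control u chosen from an admissible set.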

Optimal Control and Dynamic Games
Author: Christophe Deissenberg
Publisher: Springer Science & Business Media
Total Pages: 351
Release: 2005-11-03
Genre: Business & Economics
ISBN: 0387258051

Optimal Control and Dynamic Games has been edited to honor the outstanding contributions of Professor Suresh Sethi in the field of Applied Optimal Control. Professor Sethi is internationally one of the foremost experts in this field. Among other works, he is co-author of the popular textbook "Sethi and Thompson: Optimal Control Theory: Applications to Management Science and Economics". The book consists of a collection of essays by some of the best known scientists in the field, covering diverse aspects of applications of optimal control and dynamic games to problems in Finance, Management Science, Economics, and Operations Research. In doing so, it provides both a state-of-the-art overview of recent developments in the field and a reference work covering the wide variety of contemporary questions that can be addressed with optimal control tools, and it demonstrates the fruitfulness of the methodology.

Optimal Control
Author: Michael Athans
Publisher: Courier Corporation
Total Pages: 900
Release: 2013-04-26
Genre: Technology & Engineering
ISBN: 0486318184

Geared toward advanced undergraduate and graduate engineering students, this text introduces the theory and applications of optimal control. It serves as a bridge to the technical literature, enabling students to evaluate the implications of theoretical control work and to judge the merits of papers on the subject. Rather than presenting an exhaustive treatise, Optimal Control offers a detailed introduction that fosters careful thinking and disciplined intuition. It develops the basic mathematical background, with a coherent formulation of the control problem and discussions of the necessary conditions for optimality based on the maximum principle of Pontryagin. In-depth examinations cover applications of the theory to minimum-time, minimum-fuel, and quadratic-criteria problems. The structure, properties, and engineering realizations of several optimal feedback control systems also receive attention. Special features include numerous specific problems, carried through to engineering realization in block diagram form. The text treats almost all current examples of control problems that permit analytic solutions, and its unified approach makes frequent use of geometric ideas to encourage students' intuition.
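
As a hedged illustration of the quadratic-criteria problems the text examines (generic notation, not the book's own), consider linear dynamics \dot{x} = Ax + Bu with the cost

    J = \frac{1}{2} \int_0^{\infty} \big( x^\top Q x + u^\top R u \big)\, dt, \qquad Q \succeq 0, \; R \succ 0.

Under standard stabilizability and detectability assumptions, the optimal control is the linear state feedback

    u^*(t) = -R^{-1} B^\top P\, x(t),

where P is the symmetric, positive semidefinite solution of the algebraic Riccati equation

    A^\top P + P A - P B R^{-1} B^\top P + Q = 0.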