The main purpose of this book is to present the latest results, obtained in recent years by the author and others, on the absolute stability of nonlinear control systems. These results have two characteristic features: theoretically, they give as many necessary and sufficient conditions as possible for the absolute stability of the various nonlinear control systems that arise in applications; methodologically, they derive from these theoretical necessary and sufficient conditions algebraic sufficient conditions simple enough, and in some cases constructive, to be used in practical work.
This volume presents an overview of some recent developments in the absolute stability of nonlinear control systems.
The contents are divided into six chapters as follows: Chapter 1 introduces the main tools and principal results used in this book, such as Liapunov functions, K-class functions, Dini-derivatives, M-matrices and the principal theorems on global stability. Chapter 2 presents the absolute stability theory of autonomous control systems and the well-known Lurie problem. Chapter 3 gives some simple algebraic necessary and sufficient conditions for the absolute stability of several special control systems. Chapter 4 discusses nonautonomous and discrete control systems. Chapter 5 deals with the absolute stability of control systems with m nonlinear control terms. Chapter 6 is devoted to the absolute stability of control systems described by functional differential equations.
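For orientation, the direct control system of Lurie type studied in Chapter 2 is commonly written, in standard notation that may differ in detail from the book's own, as

\[
\dot{x} = Ax + b\,f(\sigma), \qquad \sigma = c^{T}x, \qquad f(0) = 0, \quad \sigma f(\sigma) > 0 \ \ (\sigma \neq 0),
\]

where A is a constant n-by-n matrix and b, c are constant vectors. Absolute stability then means that the zero solution is globally asymptotically stable for every admissible nonlinearity f in this class, not merely for one particular f.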
The book concludes with a useful bibliography.
Applied mathematicians and engineers whose work involves control systems.
Chapter 1. Principal Theorems on Global Stability
1.1. Liapunov Functions and K-Class Functions
1.2. Dini-Derivatives
1.3. M-Matrices
1.4. Principal Theorems on Global Stability
1.5. Partial Global Stability
1.6. Global Stability of Sets
1.7. Nonautonomous Systems
1.8. The Systems with Separable Variables
1.9. Autonomous Systems with Generalized Separable Variables
1.10. Nonautonomous Systems with Separable Variables
1.11. Notes
Chapter 2. Autonomous Control Systems
2.1. The Expression and Classification of the Problems
2.2. Necessary and Sufficient Conditions for Absolute Stability
2.3. The S-Method and Modified S-Method
2.4. Direct Control Systems
2.5. Indirect Control Systems
2.6. Notes
Chapter 3. Special Control Systems
3.1. The Second Order Direct Control Systems
3.2. A Class of the Third Order Control Systems
3.3. Special Direct Control Systems of the nth Order
3.4. The First Canonical Form of Control Systems
3.5. Critical Systems
3.6. The Second Canonical Form of Control Systems
3.7. Notes
Chapter 4. Nonautonomous and Discrete Control Systems
4.1. Nonautonomous Systems
4.2. The Systems with Separable Variables
4.3. Direct Control Systems
4.4. Indirect Control Systems
4.5. The Systems with Rigid and Revolving Feedback
4.6. Discrete Control Systems
4.7. Notes
Chapter 5. Control Systems with m Nonlinear Control Terms
5.1. Necessary and Sufficient Conditions for Absolute Stability
5.2. Some Simple Sufficient Conditions for Absolute Stability
5.3. Discrimination of Definite Sign for Lurie's Functions
5.4. Particular Systems
5.5. Nonautonomous Systems
5.6. Notes
Chapter 6. Control Systems Described by FDE
6.1. The Systems Described by RFDE
6.2. Large-Scale Control Systems Described by RFDE
6.3. The Systems Described by NFDE
6.4. Control Systems in Hilbert Spaces
6.5. Notes
Bibliography
Index