Discrete Control Systems establishes a basis for the analysis and design of discretized/quantized control systems for continuous physical systems. Beginning with the necessary mathematical foundations and system-model descriptions, the text moves on to derive a robust stability condition. To keep a practical perspective on the uncertain physical systems considered, most of the methods are developed in the frequency domain.
As part of the design procedure, modified Nyquist–Hall and Nichols diagrams are presented, and discretized proportional–integral–derivative (PID) control schemes are reconsidered. Schemes for model-reference feedback and discrete-type observers are proposed. Although single-loop feedback systems form the core of the text, some consideration is given to multiple loops and nonlinearities. The robust control performance and stability of interval systems (with multiple uncertainties) are outlined.
Finally, the monograph describes the relationship between feedback control and discrete-event systems. The nonlinear phenomena associated with practically important event-driven systems are elucidated, and the dynamics and stability of finite-state and discrete-event systems are defined.
Academic researchers interested in the uses of discrete modelling and control of continuous systems will find Discrete Control Systems instructive. The inclusion of end-of-chapter problems also makes the book suitable for self-study, whether by professional control engineers or by graduate students supplementing a more formal regimen of learning.