000 02196nam a22002177a 4500
999 _c1383
_d1383
005 20210928141157.0
008 210928b ||||| |||| 00| 0 eng d
020 _a9788120346826
082 _a519.23
_bSPE
100 _aSpeyer, Jason L
_93832
245 _aStochastic processes, estimation, and control
260 _bPHI Learning Pvt. Ltd.
_aNew Delhi
_c2013
300 _axiv, 383 p.
365 _aINR
_b350.00
520 _aUncertainty and risk are integral to engineering because real systems have inherent ambiguities that arise naturally or due to our inability to model complex physics. The authors discuss probability theory, stochastic processes, estimation, and stochastic control strategies, and show how probability can be used to model uncertainty in control and estimation problems. The material is practical and rich in research opportunities. The authors provide a comprehensive treatment of stochastic systems from the foundations of probability to stochastic optimal control. The book covers discrete- and continuous-time stochastic dynamic systems leading to the derivation of the Kalman filter, its properties, and its relation to the frequency-domain Wiener filter, as well as the dynamic programming derivation of the linear quadratic Gaussian (LQG) and linear exponential Gaussian (LEG) controllers and their relation to H2 and H-infinity controllers and system robustness. Stochastic Processes, Estimation, and Control is divided into three related sections. First, the authors present the concepts of probability theory, random variables, and stochastic processes, which lead to the topics of expectation, conditional expectation, discrete-time estimation, and the Kalman filter. After establishing this foundation, stochastic calculus and continuous-time estimation are introduced. Finally, dynamic programming for both discrete-time and continuous-time systems leads to the solution of optimal stochastic control problems, resulting in controllers with significant practical application.
650 _aStochastic processes
_9814
650 _aControl theory
_93833
650 _aEstimation theory
_93834
700 _aChung, Walter H.
_93835
942 _2ddc
_cBK