Markov Chains

Author: Paul A. Gagniuc
Publisher: John Wiley & Sons
Total Pages: 252
Release: 2017-07-31
Genre: Mathematics
ISBN: 1119387558

A fascinating and instructive guide to Markov chains for experienced users and newcomers alike. This unique guide to Markov chains approaches the subject along the four convergent lines of mathematics, implementation, simulation, and experimentation. It introduces readers to the art of stochastic modeling, shows how to design computer implementations, and provides extensive worked examples with case studies. Markov Chains: From Theory to Implementation and Experimentation begins with a general introduction to the history of probability theory, in which the author uses quantifiable examples to illustrate how probability theory arrived at the concepts of discrete time and the Markov model from experiments involving independent variables. An introduction to simple stochastic matrices and transition probabilities is followed by a simulation of a two-state Markov chain. The notion of steady state is explored in connection with the long-run distribution behavior of the Markov chain. Predictions based on Markov chains with more than two states are examined, followed by a discussion of the notion of absorbing Markov chains. Also covered in detail are topics relating to the average time spent in a state, various chain configurations, and n-state Markov chain simulations used for verifying experiments involving various diagram configurations.
• Fascinating historical notes shed light on the key ideas that led to the development of the Markov model and its variants
• Various configurations of Markov chains and their limitations are explored at length
• Numerous examples, from basic to complex, are presented in a comparative manner using a variety of color graphics
• All algorithms presented can be analyzed in Visual Basic, JavaScript, or PHP
• Designed to be useful to professional statisticians as well as readers without extensive knowledge of probability theory

Covering both the theory underlying the Markov model and an array of Markov chain implementations within a common conceptual framework, Markov Chains: From Theory to Implementation and Experimentation is a stimulating introduction to, and a valuable reference for, those wishing to deepen their understanding of this extremely valuable statistical tool. Paul A. Gagniuc, PhD, is Associate Professor at the Polytechnic University of Bucharest, Romania. He obtained his MS and PhD in genetics at the University of Bucharest. Dr. Gagniuc's work has been published in numerous high-profile scientific journals, ranging from the Public Library of Science to BioMed Central and Nature journals. He has received several awards for exceptional scientific results and is a highly active figure in the review process for different scientific areas.
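The two-state chain and steady-state behavior described in this blurb can be sketched in a few lines. The transition probabilities below are illustrative assumptions, not values from the book (whose own worked code is in Visual Basic, JavaScript, and PHP):

```python
import numpy as np

# Hypothetical two-state transition matrix (each row sums to 1);
# the specific probabilities are illustrative, not from the book.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Approximate the long-run (steady-state) distribution by repeatedly
# multiplying an initial distribution by P.
pi = np.array([1.0, 0.0])
for _ in range(100):
    pi = pi @ P

# For this matrix the exact steady state, solving pi P = pi with
# pi summing to 1, is (5/6, 1/6).
print(pi)  # approx [0.8333, 0.1667]
```

Starting from the other state (or any distribution) converges to the same limit, which is the "long-run distribution behavior" the blurb refers to.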

Markov Chains and Decision Processes for Engineers and Managers

Author: Theodore J. Sheskin
Publisher: CRC Press
Total Pages: 478
Release: 2016-04-19
Genre: Mathematics
ISBN: 1420051121

Recognized as a powerful tool for dealing with uncertainty, Markov modeling can enhance your ability to analyze complex production and service systems. However, most books on Markov chains or decision processes are either highly theoretical, with few examples, or highly prescriptive, with little justification for the steps of the algorithms used.

Markov Chains

Author: Wai-Ki Ching
Publisher: Springer Science & Business Media
Total Pages: 259
Release: 2013-03-27
Genre: Business & Economics
ISBN: 1461463122

This new edition of Markov Chains: Models, Algorithms and Applications has been completely reformatted as a text, complete with end-of-chapter exercises, a new focus on management science, new applications of the models, and new examples with applications in financial risk management and modeling of financial data. This book consists of eight chapters. Chapter 1 gives a brief introduction to the classical theory of both discrete- and continuous-time Markov chains. The relationship between finite-state Markov chains and matrix theory is also highlighted. Some classical iterative methods for solving linear systems are introduced for finding the stationary distribution of a Markov chain. The chapter then covers the basic theory and algorithms for hidden Markov models (HMMs) and Markov decision processes (MDPs). Chapter 2 discusses the applications of continuous-time Markov chains to model queueing systems and of discrete-time Markov chains to compute PageRank, the ranking of websites on the Internet. Chapter 3 studies Markovian models for manufacturing and re-manufacturing systems and presents closed-form solutions and fast numerical algorithms for solving the captured systems. In Chapter 4, the authors present a simple hidden Markov model (HMM) with fast numerical algorithms for estimating the model parameters. An application of the HMM to customer classification is also presented. Chapter 5 discusses Markov decision processes for customer lifetime values. Customer lifetime value (CLV) is an important concept and quantity in marketing management. The authors present an approach based on Markov decision processes for the calculation of CLV using real data. Chapter 6 considers higher-order Markov chain models, particularly a class of parsimonious higher-order Markov chain models. Efficient estimation methods for model parameters based on linear programming are presented.
Contemporary research results on applications to demand prediction, inventory control, and financial risk measurement are also presented. In Chapter 7, a class of parsimonious multivariate Markov models is introduced. Again, efficient estimation methods based on linear programming are presented. Applications to demand prediction, inventory control policy, and modeling credit ratings data are discussed. Finally, Chapter 8 revisits hidden Markov models, and the authors present a new class of hidden Markov models with efficient algorithms for estimating the model parameters. Applications to modeling interest rates, credit ratings, and default data are discussed. This book is aimed at senior undergraduate students, postgraduate students, professionals, practitioners, and researchers in applied mathematics, computational science, operational research, management science, and finance who are interested in the formulation and computation of queueing networks, Markov chain models, and related topics. Readers are expected to have some basic knowledge of probability theory, Markov processes, and matrix theory.
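The PageRank application described for Chapter 2 boils down to the stationary distribution of a "random surfer" Markov chain, found here by simple power iteration. The three-page link graph and the damping factor are illustrative assumptions, not an example from the book:

```python
import numpy as np

# Tiny illustrative link graph: page 0 links to 1 and 2,
# page 1 links to 2, page 2 links back to 0.
links = {0: [1, 2], 1: [2], 2: [0]}
n, d = 3, 0.85  # d is the usual damping factor

# Column-stochastic transition matrix of the random surfer.
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[dst, src] = 1.0 / len(outs)

# Power iteration on the damped ("Google") matrix; since M is
# column-stochastic, the rank vector keeps summing to 1.
r = np.full(n, 1.0 / n)
for _ in range(100):
    r = d * (M @ r) + (1 - d) / n

print(r)  # stationary ranks, summing to 1
```

In practice such iterative methods are used precisely because the web-scale transition matrix is far too large to solve directly, which is the point of the linear-system methods the blurb mentions.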

Markov Chains: Models, Algorithms and Applications

Author: Wai-Ki Ching
Publisher: Springer Science & Business Media
Total Pages: 212
Release: 2006-06-05
Genre: Mathematics
ISBN: 038729337X

Markov chains are a particularly powerful and widely used tool for analyzing a variety of stochastic (probabilistic) systems over time. This monograph presents a series of Markov models, starting from the basic models and then building up to higher-order models. Included in the higher-order discussions are multivariate models, higher-order multivariate models, and higher-order hidden models. In each case, the focus is on the important kinds of applications that can be made with the class of models being considered in the current chapter. Special attention is given to numerical algorithms that can efficiently solve the models. Therefore, Markov Chains: Models, Algorithms and Applications outlines recent developments of Markov chain models for modeling queueing systems, the Internet, re-manufacturing systems, reverse logistics, inventory systems, bioinformatics, DNA sequences, genetic networks, data mining, and many other practical systems.

Sensitivity Analysis: Matrix Methods in Demography and Ecology

Author: Hal Caswell
Publisher: Springer
Total Pages: 308
Release: 2019-04-02
Genre: Social Science
ISBN: 3030105342

This open access book shows how to use sensitivity analysis in demography. It presents new methods for individuals, cohorts, and populations, with applications to humans, other animals, and plants. The analyses are based on matrix formulations of age-classified, stage-classified, and multistate population models. Methods are presented for linear and nonlinear, deterministic and stochastic, and time-invariant and time-varying cases. Readers will discover results on the sensitivity of statistics of longevity, life disparity, occupancy times, the net reproductive rate, and statistics of Markov chain models in demography. They will also see applications of sensitivity analysis to population growth rates, stable population structures, reproductive value, equilibria under immigration and nonlinearity, and population cycles. Individual stochasticity is a theme throughout, with a focus that goes beyond expected values to include variances in demographic outcomes. The calculations are easily and accurately implemented in matrix-oriented programming languages such as MATLAB or R. Sensitivity analysis will help readers create models to predict the effect of future changes, to evaluate policy effects, and to identify possible evolutionary responses to the environment. Complete with many examples of these applications, the book will be of interest to researchers and graduate students in human demography and population biology. The material will also appeal to those in mathematical biology and applied mathematics.
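The occupancy times mentioned above are classically obtained from the fundamental matrix of an absorbing Markov chain. A minimal sketch, with an illustrative transient block Q (the numbers are assumptions, not data from the book):

```python
import numpy as np

# Transient-to-transient block Q of an absorbing chain;
# these probabilities are purely illustrative.
Q = np.array([[0.2, 0.6],
              [0.3, 0.5]])

# Fundamental matrix N = (I - Q)^(-1).
# N[i, j] = expected number of visits to transient state j
# when starting in transient state i.
N = np.linalg.inv(np.eye(2) - Q)

# Row sums give the expected total time spent among the
# transient states before absorption (e.g. before death,
# in the demographic reading).
print(N.sum(axis=1))
```

For this particular Q both row sums happen to equal 5, i.e. an expected five steps before absorption from either starting state.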

Markov Chains and Stochastic Stability

Author: Sean Meyn
Publisher: Cambridge University Press
Total Pages: 623
Release: 2009-04-02
Genre: Mathematics
ISBN: 0521731828

New up-to-date edition of this influential classic on Markov chains in general state spaces. Proofs are rigorous and concise, the range of applications is broad and knowledgeable, and key ideas are accessible to practitioners with limited mathematical background. New commentary by Sean Meyn, including updated references, reflects developments since 1996.

Hidden Markov Models for Time Series

Author: Walter Zucchini
Publisher: CRC Press
Total Pages: 272
Release: 2017-12-19
Genre: Mathematics
ISBN: 1315355205

Hidden Markov Models for Time Series: An Introduction Using R, Second Edition illustrates the great flexibility of hidden Markov models (HMMs) as general-purpose models for time series data. The book provides a broad understanding of the models and their uses. After presenting the basic model formulation, the book covers estimation, forecasting, decoding, prediction, model selection, and Bayesian inference for HMMs. Through examples and applications, the authors describe how to extend and generalize the basic model so that it can be applied in a rich variety of situations. The book demonstrates how HMMs can be applied to a wide range of types of time series: continuous-valued, circular, multivariate, binary, bounded and unbounded counts, and categorical observations. It also discusses how to employ the freely available computing environment R to carry out the computations.

Features
• Presents an accessible overview of HMMs
• Explores a variety of applications in ecology, finance, epidemiology, climatology, and sociology
• Includes numerous theoretical and programming exercises
• Provides most of the analysed data sets online

New to the second edition
• A total of five chapters on extensions, including HMMs for longitudinal data, hidden semi-Markov models, and models with a continuous-valued state process
• New case studies on animal movement, rainfall occurrence, and capture-recapture data
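The decoding task listed above (recovering the most likely hidden state sequence from observations) is classically solved with the Viterbi algorithm. The two-state, three-symbol model below is an illustrative toy, not an example from the book (which works in R rather than Python):

```python
import numpy as np

# Toy HMM: 2 hidden states, 3 observation symbols.
# All probabilities here are illustrative assumptions.
A = np.array([[0.7, 0.3],       # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],  # emission probabilities per state
              [0.1, 0.3, 0.6]])
pi = np.array([0.6, 0.4])       # initial state distribution

def viterbi(obs):
    """Return the most likely hidden state sequence for obs."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))          # best path probabilities
    psi = np.zeros((T, N), dtype=int) # backpointers
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    # Backtrack from the most probable final state.
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1]

print(viterbi([0, 1, 2]))  # -> [0, 0, 1]
```

Production implementations work in log space to avoid underflow on long sequences; this sketch keeps raw probabilities for readability.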

Hidden Markov Models

Author: Robert J Elliott
Publisher: Springer Science & Business Media
Total Pages: 374
Release: 2008-09-27
Genre: Science
ISBN: 0387848541

As more applications are found, interest in hidden Markov models continues to grow. Following comments and feedback from colleagues, students, and others working with hidden Markov models, the corrected third printing of this volume contains clarifications, improvements, and some new material, including results on smoothing for linear Gaussian dynamics. In Chapter 2 the derivations of the basic filters related to the Markov chain are each presented explicitly, rather than as special cases of one general filter. Furthermore, equations for smoothed estimates are given. The dynamics for the Kalman filter are derived as special cases of the authors' general results, and new expressions for a Kalman smoother are given. The chapters on the control of hidden Markov chains are expanded and clarified. The revised Chapter 4 includes state estimation for discrete-time Markov processes, and Chapter 12 has a new section on robust control.