Theory of Statistical Inference and Information
Author: Igor Vajda
Publisher: Springer
Total Pages: 440
Release: 1989-02-28
Genre: Mathematics
ISBN:
Author: Frank Emmert-Streib
Publisher: Springer Science & Business Media
Total Pages: 443
Release: 2009
Genre: Computers
ISBN: 0387848150
This interdisciplinary text offers theoretical and practical results of information theoretic methods used in statistical learning. It presents a comprehensive overview of the many different methods that have been developed in numerous contexts.
Author: Anthony Almudevar
Publisher: CRC Press
Total Pages: 1059
Release: 2021-12-30
Genre: Mathematics
ISBN: 1000488071
Theory of Statistical Inference is designed as a reference on statistical inference for researchers and students at the graduate or advanced undergraduate level. It presents a unified treatment of the foundational ideas of modern statistical inference, and would be suitable for a core course in a graduate program in statistics or biostatistics. The emphasis is on the application of mathematical theory to the problem of inference, leading to an optimization theory allowing the choice of those statistical methods yielding the most efficient use of data. The book shows how a small number of key concepts, such as sufficiency, invariance, stochastic ordering, decision theory and vector space algebra play a recurring and unifying role. The volume can be divided into four sections. Part I provides a review of the required distribution theory. Part II introduces the problem of statistical inference. This includes the definitions of the exponential family, invariant and Bayesian models. Basic concepts of estimation, confidence intervals and hypothesis testing are introduced here. Part III constitutes the core of the volume, presenting a formal theory of statistical inference. Beginning with decision theory, this section then covers uniformly minimum variance unbiased (UMVU) estimation, minimum risk equivariant (MRE) estimation and the Neyman-Pearson test. Finally, Part IV introduces large sample theory. This section begins with stochastic limit theorems, the δ-method, the Bahadur representation theorem for sample quantiles, large sample U-estimation, the Cramér-Rao lower bound and asymptotic efficiency. A separate chapter is then devoted to estimating equation methods. The volume ends with a detailed development of large sample hypothesis testing, based on the likelihood ratio test (LRT), Rao score test and the Wald test. 
Features: This volume includes treatment of linear and nonlinear regression models, ANOVA models, generalized linear models (GLM) and generalized estimating equations (GEE). An introduction to decision theory (including risk, admissibility, classification, Bayes and minimax decision rules) is presented, and the importance of this sometimes overlooked topic to statistical methodology is emphasized. The volume emphasizes throughout the important role that can be played by group theory and invariance in statistical inference. Nonparametric (rank-based) methods are derived by the same principles used for parametric models and are therefore presented as solutions to well-defined mathematical problems, rather than as robust heuristic alternatives to parametric methods. Each chapter ends with a set of theoretical and applied exercises integrated with the main text; problems involving R programming are included. Appendices summarize the necessary background in analysis, matrix algebra and group theory.
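The large-sample topics listed above can be made concrete with a small simulation. The following sketch is purely illustrative (the function name and parameter values are assumptions, not taken from the book): it compares the empirical variance of the sample mean, the MLE of the mean in a normal model with known variance, against the Cramér-Rao lower bound σ²/n developed in Part IV.

```python
import random
import statistics

def crlb_demo(mu=2.0, sigma=1.5, n=50, reps=2000, seed=0):
    """Compare the empirical variance of the sample mean (the MLE of mu
    in a N(mu, sigma^2) model with sigma known) against the
    Cramer-Rao lower bound sigma^2 / n for unbiased estimators.
    Hypothetical illustration; parameter values are arbitrary."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(reps):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        estimates.append(sum(sample) / n)  # MLE of mu for this sample
    empirical_var = statistics.variance(estimates)
    bound = sigma ** 2 / n  # Cramer-Rao lower bound
    return empirical_var, bound

emp, bound = crlb_demo()
# The sample mean attains the bound, so the two numbers agree up to
# Monte Carlo error; this is the asymptotic-efficiency story in miniature.
```

Because the sample mean is efficient in this model, the empirical variance sits essentially on the bound rather than above it.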
Author: Hannelore Liero
Publisher: CRC Press
Total Pages: 280
Release: 2016-04-19
Genre: Mathematics
ISBN: 1466503203
Based on the authors' lecture notes, this text presents concise yet complete coverage of statistical inference theory, focusing on the fundamental classical principles. Unlike related textbooks, it combines the theoretical basis of statistical inference with a useful applied toolbox that includes linear models. Suitable for a second semester undergraduate course on statistical inference, the text offers proofs to support the mathematics and does not require any use of measure theory. It illustrates core concepts using cartoons and provides solutions to all examples and problems.
Author: David J. C. MacKay
Publisher: Cambridge University Press
Total Pages: 694
Release: 2003-09-25
Genre: Computers
ISBN: 9780521642989
Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning, and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
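As a flavour of the compression side of the subject, a short sketch (the function name is hypothetical) computes the empirical Shannon entropy of a string - the per-symbol limit that symbol codes such as Huffman codes can only approach, and that arithmetic coding approaches to within a fraction of a bit.

```python
import math
from collections import Counter

def entropy_bits(text):
    """Empirical Shannon entropy (bits per symbol) of a string, i.e. the
    compression limit for a memoryless model of its symbol frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

h = entropy_bits("abracadabra")
# A string with only one distinct symbol has entropy 0 and compresses
# to nothing under this model; "abracadabra" needs about 2.04 bits/symbol.
```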
Author: Pierre Moulin
Publisher: Cambridge University Press
Total Pages: 423
Release: 2019
Genre: Mathematics
ISBN: 1107185920
A mathematically accessible textbook introducing all the tools needed to address modern inference problems in engineering and data science.
Author: E.J.G. Pitman
Publisher: CRC Press
Total Pages: 110
Release: 2018-01-18
Genre: Mathematics
ISBN: 1351093673
In this book the author presents with elegance and precision some of the basic mathematical theory required for statistical inference at a level which will make it readable by most students of statistics.
Author: Aris Spanos
Publisher: Cambridge University Press
Total Pages: 787
Release: 2019-09-19
Genre: Business & Economics
ISBN: 1107185149
This course in empirical research methods enables the informed implementation of statistical procedures, giving rise to trustworthy evidence.
Author: Imre Csiszár
Publisher: Now Publishers Inc
Total Pages: 128
Release: 2004
Genre: Computers
ISBN: 9781933019055
Information Theory and Statistics: A Tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. An introduction is also provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory. The tutorial does not assume the reader has an in-depth knowledge of information theory or statistics. As such, Information Theory and Statistics: A Tutorial is an excellent introductory text to this highly important topic in mathematics, computer science and electrical engineering, providing both students and researchers with an invaluable resource to quickly get up to speed in the field.
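The finite-alphabet hypothesis-testing results the tutorial covers revolve around the Kullback-Leibler divergence D(P||Q). A minimal sketch (the function name is hypothetical) computes it for distributions given as probability vectors:

```python
import math

def kl_divergence(p, q):
    """D(P||Q) = sum_x p(x) * log(p(x)/q(x)) in nats, on a finite
    alphabet.  Conventions: 0 * log(0/q) = 0, and the divergence is
    infinite if p puts mass where q puts none."""
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue
        if qx == 0:
            return math.inf
        total += px * math.log(px / qx)
    return total

# Two distributions on a 3-letter alphabet
P = [0.5, 0.3, 0.2]
Q = [1 / 3, 1 / 3, 1 / 3]
d = kl_divergence(P, Q)
# In the large-deviation results (e.g. Stein's lemma), the type-II error
# of an optimal test of P against Q decays roughly like exp(-n * D(P||Q)).
```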
Author: C.S. Wallace
Publisher: Springer Science & Business Media
Total Pages: 456
Release: 2005-05-26
Genre: Computers
ISBN: 9780387237954
The Minimum Message Length (MML) Principle is an information-theoretic approach to induction, hypothesis testing, model selection, and statistical inference. MML, which provides a formal specification for the implementation of Occam's Razor, asserts that the 'best' explanation of observed data is the shortest. Further, an explanation is acceptable (i.e. the induction is justified) only if the explanation is shorter than the original data.

This book gives a sound introduction to the Minimum Message Length Principle and its applications, provides the theoretical arguments for the adoption of the principle, and shows the development of certain approximations that assist its practical application. MML also appears to provide both a normative and a descriptive basis for inductive reasoning generally, and for scientific induction in particular. The book describes this basis and aims to show its relevance to the philosophy of science.

Statistical and Inductive Inference by Minimum Message Length will be of special interest to graduate students and researchers in machine learning and data mining, scientists and analysts in various disciplines wishing to make use of computer techniques for hypothesis discovery, statisticians and econometricians interested in the underlying theory of their discipline, and persons interested in the philosophy of science. The book could also be used in a graduate-level course in machine learning, estimation and model selection, econometrics, or data mining.

C.S. Wallace was appointed Foundation Chair of Computer Science at Monash University in 1968, at the age of 35, where he worked until his death in 2004. He received an ACM Fellowship in 1995 and was appointed Professor Emeritus in 1996. Professor Wallace made numerous significant contributions to diverse areas of computer science, such as computer architecture, simulation and machine learning. His final research focused primarily on the Minimum Message Length Principle.
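The two-part message comparison at the heart of MML can be sketched in a toy coin-tossing setting. This is an illustrative simplification, not Wallace's full construction: all names, the Shannon code lengths, and the 6-bit parameter precision are assumptions made for the example.

```python
import math

def data_bits(k, n, p):
    """Ideal Shannon code length (bits) for a binary string with k ones
    out of n under an i.i.d. Bernoulli(p) model, 0 < p < 1."""
    return -(k * math.log2(p) + (n - k) * math.log2(1 - p))

def mml_compare(k, n, param_bits=6):
    """Two-part MML comparison of two explanations of k ones in n tosses:
      'fair'   -- nothing to assert; encode the data with p = 1/2.
      'biased' -- first assert an estimate of p to param_bits bits of
                  precision, then encode the data with that estimate.
    The shorter total message is the preferred explanation: stating a
    parameter is worthwhile only if it pays for itself in data bits."""
    cost_fair = data_bits(k, n, 0.5)
    cells = 2 ** param_bits  # quantize p-hat to a grid of cell midpoints
    p_hat = (min(math.floor(k / n * cells), cells - 1) + 0.5) / cells
    cost_biased = param_bits + data_bits(k, n, p_hat)
    verdict = "biased" if cost_biased < cost_fair else "fair"
    return verdict, cost_fair, cost_biased

# 80 heads in 100 tosses: asserting a biased p shortens the total message.
# 52 heads in 100: the 6-bit parameter cost is not repaid, so 'fair' wins.
```

The second case shows the Occam's Razor behaviour the blurb describes: a more complex explanation is rejected when its extra assertion cost exceeds the compression it buys.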