Optimal Experimental Design

Optimal Experimental Design
Author: Jesús López-Fidalgo
Publisher: Springer Nature
Total Pages: 228
Release: 2023-10-14
Genre: Mathematics
ISBN: 3031359186

This textbook provides a concise introduction to optimal experimental design and efficiently prepares the reader for research in the area. It presents the common concepts and techniques for linear and nonlinear models as well as Bayesian optimal designs. The last two chapters are devoted to particular themes of interest, including recent developments and hot topics in optimal experimental design, and real-world applications. Numerous examples and exercises are included, some of them with solutions or hints, as well as references to the existing software for computing designs. The book is primarily intended for graduate students and young researchers in statistics and applied mathematics who are new to the field of optimal experimental design. Given the applications and the way concepts and results are introduced, parts of the text will also appeal to engineers and other applied researchers.

Asymptotic Expansions of Integrals

Asymptotic Expansions of Integrals
Author: Norman Bleistein
Publisher: Courier Corporation
Total Pages: 453
Release: 1986-01-01
Genre: Mathematics
ISBN: 0486650820

Excellent introductory text, written by two experts, presents a coherent and systematic view of principles and methods. Topics include integration by parts, Watson's lemma, Laplace's method, stationary phase, and steepest descents. Additional subjects include the Mellin transform method and less elementary aspects of the method of steepest descents. 1975 edition.
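For a flavor of the techniques listed above, the leading-order statement of Laplace's method can be written as follows (a standard result sketched here for orientation; the notation is generic and not taken from this edition):

```latex
% Laplace's method, leading order: suppose \phi attains its maximum on [a,b]
% at an interior point c with \phi''(c) < 0 and f(c) \neq 0. Then, as x -> \infty,
\[
  \int_a^b f(t)\, e^{x \phi(t)}\, dt
  \;\sim\; f(c)\, e^{x \phi(c)} \sqrt{\frac{2\pi}{x \,\lvert \phi''(c) \rvert}} .
\]
```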

Optimal Experimental Design for Large-scale Bayesian Inverse Problems

Optimal Experimental Design for Large-scale Bayesian Inverse Problems
Author: Keyi Wu (Ph.D.)
Publisher:
Total Pages: 0
Release: 2022
Genre:
ISBN:

Bayesian optimal experimental design (BOED)—including active learning, Bayesian optimization, and sensor placement—provides a probabilistic framework to maximize the expected information gain (EIG) or mutual information (MI) for uncertain parameters or quantities of interest with limited experimental data. However, evaluating the EIG remains prohibitive for large-scale complex models due to the need to compute double integrals with respect to both the parameter and data distributions. In this work, we develop a fast and scalable computational framework to solve Bayesian optimal experimental design (OED) problems governed by partial differential equations (PDEs), with application to optimal sensor placement by maximizing the EIG. We (1) exploit the low-rank structure of the Jacobian of the parameter-to-observable map to extract the intrinsic low-dimensional data-informed subspace, and (2) employ a series of approximations of the EIG that reduce the number of PDE solves while retaining a high correlation with the true EIG. This allows us to propose an efficient offline–online decomposition for the optimization problem, using a new swapping greedy algorithm for both OED problems and goal-oriented linear OED problems. The offline stage dominates the cost and entails precomputing all components that require PDE solutions. The online stage optimizes sensor placement and does not require any PDE solves. We provide a detailed error analysis with an upper bound on the approximation error in evaluating the EIG for the OED and goal-oriented linear OED cases. Finally, we evaluate the EIG with a derivative-informed projected neural network (DIPNet) surrogate for parameter-to-observable maps. With this surrogate, no further PDE solves are required to solve the optimization problem. We provide an analysis of the error propagated from the DIPNet approximation to the approximation of the normalization constant and the EIG under suitable assumptions. We demonstrate the efficiency and scalability of the proposed methods for both linear inverse problems, in which one seeks to infer the initial condition for an advection–diffusion equation, and nonlinear inverse problems, in which one seeks to infer coefficients for a Poisson problem, an acoustic Helmholtz problem, and an advection–diffusion–reaction problem. This dissertation is based on the following articles: A fast and scalable computational framework for large-scale and high-dimensional Bayesian optimal experimental design by Keyi Wu, Peng Chen, and Omar Ghattas [88]; An efficient method for goal-oriented linear Bayesian optimal experimental design: Application to optimal sensor placement by Keyi Wu, Peng Chen, and Omar Ghattas [89]; and Derivative-informed projected neural network for large-scale Bayesian optimal experimental design by Keyi Wu, Thomas O'Leary-Roseberry, Peng Chen, and Omar Ghattas [90]. This material is based upon work partially funded by DOE ASCR DE-SC0019303 and DE-SC0021239, DOD MURI FA9550-21-1-0084, and NSF DMS-2012453.
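The sensor-placement setting described above can be illustrated with a minimal sketch: in a linear-Gaussian model with identity prior covariance, the EIG of a candidate sensor set reduces to a log-determinant, and a plain greedy loop (a simplified stand-in for the swapping greedy algorithm mentioned in the abstract; all function names and data here are illustrative) picks sensors one at a time:

```python
import numpy as np

def linear_gaussian_eig(J, noise_var=1.0):
    """EIG for a linear-Gaussian model with prior N(0, I): observations
    y = J theta + noise, noise ~ N(0, noise_var * I), so
    EIG = 0.5 * logdet(I + J^T J / noise_var)."""
    J = np.atleast_2d(J)
    m = J.shape[1]
    _, logdet = np.linalg.slogdet(np.eye(m) + J.T @ J / noise_var)
    return 0.5 * logdet

def greedy_sensor_placement(J_full, k, noise_var=1.0):
    """Choose k rows (sensors) of J_full one at a time, each time adding the
    row that most increases the EIG. A plain greedy baseline, not the
    swapping greedy algorithm of the dissertation."""
    chosen = []
    for _ in range(k):
        gains = [(linear_gaussian_eig(J_full[chosen + [i]], noise_var), i)
                 for i in range(J_full.shape[0]) if i not in chosen]
        best_gain, best_i = max(gains)
        chosen.append(best_i)
    return chosen, best_gain

# Example: 50 candidate sensors observing a 10-dimensional parameter.
rng = np.random.default_rng(0)
J_full = rng.normal(size=(50, 10))
sensors, eig = greedy_sensor_placement(J_full, k=5)
print(sensors, eig)
```

In the PDE-governed setting of the dissertation, the rows of J_full would come from the (low-rank) Jacobian of the parameter-to-observable map rather than from random draws; the greedy structure itself is unchanged.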

Optimum Experimental Designs, With SAS

Optimum Experimental Designs, With SAS
Author: Anthony Atkinson
Publisher: OUP Oxford
Total Pages: 528
Release: 2007-05-24
Genre: Mathematics
ISBN: 0191537942

Experiments on patients, processes or plants all have random error, making statistical methods essential for their efficient design and analysis. This book presents the theory and methods of optimum experimental design, making them available through the use of SAS programs. Little previous statistical knowledge is assumed. The first part of the book stresses the importance of models in the analysis of data and introduces least squares fitting and simple optimum experimental designs. The second part presents a more detailed discussion of the general theory and of a wide variety of experiments. The book stresses the use of SAS to provide hands-on solutions for the construction of designs in both standard and non-standard situations. The mathematical theory of the designs is developed in parallel with their construction in SAS, so providing motivation for the development of the subject. Many chapters cover self-contained topics drawn from science, engineering and pharmaceutical investigations, such as response surface designs, blocking of experiments, designs for mixture experiments and for nonlinear and generalized linear models. Understanding is aided by the provision of "SAS tasks" after most chapters as well as by more traditional exercises and a fully supported website. The authors are leading experts in key fields and this book is ideal for statisticians and scientists in academia, research and the process and pharmaceutical industries.

Numerical Approaches for Sequential Bayesian Optimal Experimental Design

Numerical Approaches for Sequential Bayesian Optimal Experimental Design
Author: Xun Huan
Publisher:
Total Pages: 186
Release: 2015
Genre:
ISBN:

Experimental data play a crucial role in developing and refining models of physical systems. Some experiments can be more valuable than others, however. Well-chosen experiments can save substantial resources, and hence optimal experimental design (OED) seeks to quantify and maximize the value of experimental data. Common current practice for designing a sequence of experiments uses suboptimal approaches: batch (open-loop) design that chooses all experiments simultaneously with no feedback of information, or greedy (myopic) design that optimally selects the next experiment without accounting for future observations and dynamics. In contrast, sequential optimal experimental design (sOED) is free of these limitations. With the goal of acquiring experimental data that are optimal for model parameter inference, we develop a rigorous Bayesian formulation for OED using an objective that incorporates a measure of information gain. This framework is first demonstrated in a batch design setting, and then extended to sOED using a dynamic programming (DP) formulation. We also develop new numerical tools for sOED to accommodate nonlinear models with continuous (and often unbounded) parameter, design, and observation spaces. Two major techniques are employed to make solution of the DP problem computationally feasible. First, the optimal policy is sought using a one-step lookahead representation combined with approximate value iteration. This approximate dynamic programming method couples backward induction and regression to construct value function approximations. It also iteratively generates trajectories via exploration and exploitation to further improve approximation accuracy in frequently visited regions of the state space. Second, transport maps are used to represent belief states, which reflect the intermediate posteriors within the sequential design process. Transport maps offer a finite-dimensional representation of these generally non-Gaussian random variables, and also enable fast approximate Bayesian inference, which must be performed millions of times under nested combinations of optimization and Monte Carlo sampling. The overall sOED algorithm is demonstrated and verified against analytic solutions on a simple linear-Gaussian model. Its advantages over batch and greedy designs are then shown via a nonlinear application of optimal sequential sensing: inferring contaminant source location from a sensor in a time-dependent convection-diffusion system. Finally, the capability of the algorithm is tested for multidimensional parameter and design spaces in a more complex setting of the source inversion problem.
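The information-gain objective referred to above is commonly estimated by nested Monte Carlo, with an outer loop over simulated data and an inner loop over prior samples for the evidence. The sketch below shows that estimator for a single experiment (a generic illustration under a toy scalar model, not the dissertation's dynamic-programming machinery; all names are made up for the example):

```python
import numpy as np

def nested_mc_eig(simulate, log_likelihood, prior_sample, d,
                  n_outer=500, n_inner=500, rng=None):
    """Nested Monte Carlo estimate of the expected information gain at design d:
    EIG(d) ~ (1/N) sum_i [ log p(y_i | theta_i, d) - log (1/M) sum_j p(y_i | theta_j, d) ],
    where theta_i, theta_j are prior draws and y_i ~ p(y | theta_i, d)."""
    rng = rng or np.random.default_rng()
    outer_thetas = prior_sample(n_outer, rng)
    inner_thetas = prior_sample(n_inner, rng)
    total = 0.0
    for theta in outer_thetas:
        y = simulate(theta, d, rng)                        # simulate one data set
        log_lik = log_likelihood(y, theta, d)              # numerator term
        inner = np.array([log_likelihood(y, t, d) for t in inner_thetas])
        log_evidence = np.logaddexp.reduce(inner) - np.log(n_inner)  # marginal likelihood
        total += log_lik - log_evidence
    return total / n_outer

# Toy model: theta ~ N(0, 1), y = theta * d + eps, eps ~ N(0, 0.1^2).
sigma = 0.1
prior_sample = lambda n, rng: rng.normal(size=n)
simulate = lambda theta, d, rng: theta * d + sigma * rng.normal()
log_likelihood = lambda y, theta, d: (-0.5 * ((y - theta * d) / sigma) ** 2
                                      - np.log(sigma * np.sqrt(2.0 * np.pi)))

for d in [0.1, 1.0, 2.0]:
    print(d, nested_mc_eig(simulate, log_likelihood, prior_sample, d))
```

In the sequential setting, the same estimator would be applied stage by stage with the current belief state in place of the prior, which is where the transport-map representation of intermediate posteriors enters.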

Optimum Experimental Designs, With SAS

Optimum Experimental Designs, With SAS
Author: Anthony Atkinson
Publisher: Oxford University Press, USA
Total Pages: 528
Release: 2007-05-24
Genre: Business & Economics
ISBN: 0199296596

Experiments in the field and in the laboratory cannot avoid random error and statistical methods are essential for their efficient design and analysis. Authored by leading experts in key fields, this text provides many examples of SAS code, results, plots and tables, along with a fully supported website.

Optimal Bayesian Experimental Design in the Presence of Model Error

Optimal Bayesian Experimental Design in the Presence of Model Error
Author:
Publisher:
Total Pages: 90
Release: 2015
Genre:
ISBN:

The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction. We propose an information-theoretic framework and algorithms for robust optimal experimental design with simulation-based models, with the goal of maximizing information gain in targeted subsets of model parameters, particularly in situations where experiments are costly. Our framework employs a Bayesian statistical setting, which naturally incorporates heterogeneous sources of information. The objective function reflects the expected information gain from proposed experimental designs. Monte Carlo sampling is used to evaluate the expected information gain, and stochastic approximation algorithms make optimization feasible for computationally intensive and high-dimensional problems. A key aspect of our framework is the introduction of model calibration discrepancy terms that are used to "relax" the model so that proposed optimal experiments are more robust to model error or inadequacy. We illustrate the approach via several model problems and misspecification scenarios. In particular, we show how optimal designs are modified by allowing for model error, and we evaluate the performance of various designs by simulating "real-world" data from models not considered explicitly in the optimization objective.
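The abstract above pairs Monte Carlo evaluation of the expected information gain with stochastic approximation for the design optimization. The sketch below shows one generic stochastic-approximation scheme (SPSA) ascending a noisy objective, with a noisy quadratic standing in for a Monte Carlo EIG estimate (the particular algorithm and all names are illustrative assumptions, not taken from this work):

```python
import numpy as np

def spsa_maximize(noisy_objective, d0, n_iters=200, a=0.1, c=0.1, rng=None):
    """Simultaneous perturbation stochastic approximation (SPSA): climb a noisy
    objective using only two function evaluations per iteration."""
    rng = rng or np.random.default_rng()
    d = np.asarray(d0, dtype=float)
    for k in range(1, n_iters + 1):
        ak = a / k ** 0.602                               # standard SPSA gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=d.shape)     # Rademacher perturbation
        y_plus = noisy_objective(d + ck * delta)
        y_minus = noisy_objective(d - ck * delta)
        g_hat = (y_plus - y_minus) / (2.0 * ck) / delta   # simultaneous-perturbation gradient estimate
        d = d + ak * g_hat                                # ascent step (maximization)
    return d

# Toy stand-in for a noisy EIG estimate, maximized at d = [1, 2].
rng = np.random.default_rng(0)
target = np.array([1.0, 2.0])
noisy_objective = lambda d: -np.sum((d - target) ** 2) + 0.05 * rng.normal()

d_star = spsa_maximize(noisy_objective, d0=np.zeros(2), rng=rng)
print(d_star)  # should land close to [1, 2]
```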