The University of Southampton
Southampton Statistical Sciences Research Institute

Manchester - Southampton - Glasgow Design of Experiments Seminar

Time:
14:00 - 17:00
Date:
1 June 2016
Venue:
University of Southampton, Highfield Campus, Building 27, Room 2003 (Lecture Room 2)

Event details

Workshop organised by Dave Woods (Southampton)

 

14.00 - Welcome

14.05 - Werner Müller

14.55 - Tim Muiruri Kinyanjui

15.45 - Coffee

16.10 - Ben Torsney

17.00 - Finish. Drinks and further discussion in the Crown

Abstracts

Werner Müller, Johannes Kepler Universität Linz, Austria

ABCD: Approximate Bayesian Computing Design

We are concerned with improving data collecting schemes via methods of optimum experimental design, which can be applied in cases where the experimenter has at least partial control over the experimental conditions. Furthermore, we focus on cases where a probability model for the investigated phenomenon is not easily available and the situation lends itself naturally to simulation-based approaches, in conjunction with a recently popularized simulation technique called approximate Bayesian computing (ABC).

The objective of optimum experimental design is to find the best possible configuration of factor settings with respect to a well-defined criterion or measure of information for a specific statistical model. In Bayesian experimental design, a prior distribution is attached to the parameters of the statistical model. This prior distribution reflects prior knowledge about the parameters of the model. In the Bayesian setting it is natural to average a criterion over the parameter values with respect to the prior distribution. In a decision-theoretic approach to experimental design the criterion of interest is computed for the posterior distribution of the parameters and then averaged over the marginal distribution of the data. The information criterion on the posterior distribution reflects some notion of learning from the observations.

The computation of the expected criterion value can be a challenging task. Usually this involves the evaluation of integrals or sums. If the integrals are analytically intractable and numerical integration routines do not work, Monte Carlo simulation strategies can be applied in a framework of stochastic optimization. Some of our proposed methods will be based on simulation-based optimal design algorithms which utilize Markov chain Monte Carlo (MCMC) methods, but we intend to go beyond that class. Simulation-based methods make it possible to efficiently solve a wider range of problems for which standard methods cannot provide tractable solutions. In this presentation we outline the potential and limitations of ABC for design purposes. Furthermore, we will report details of an application dealing with spatial extremes.
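As a toy illustration of this simulation-based approach (not the speakers' actual method), the sketch below estimates the expected utility of a one-factor design by Monte Carlo, using ABC rejection in place of an explicit likelihood. The Gaussian simulator, the sample mean as summary statistic, and negative posterior variance as the utility are all illustrative assumptions chosen to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, x, n=5):
    # Forward simulator only: y_i ~ N(theta * x, 1). In a genuine ABC
    # setting the likelihood would be intractable; this Gaussian model
    # is a hypothetical stand-in.
    return rng.normal(theta * x, 1.0, size=n)

def abc_posterior(y_obs, x, n_prior=4000, keep=200):
    # ABC rejection: draw parameters from the prior, simulate data, and
    # keep the draws whose simulated sample mean (the summary statistic)
    # is closest to the observed one.
    thetas = rng.normal(0.0, 1.0, size=n_prior)
    sims = rng.normal(thetas[:, None] * x, 1.0, size=(n_prior, 5))
    dists = np.abs(sims.mean(axis=1) - y_obs.mean())
    return thetas[np.argsort(dists)[:keep]]

def expected_utility(x, n_outer=100):
    # Monte Carlo estimate of the expected criterion value: average the
    # utility (negative posterior variance) over data sets drawn from
    # the prior predictive distribution at design point x.
    utils = []
    for _ in range(n_outer):
        theta_true = rng.normal(0.0, 1.0)
        y_obs = simulate(theta_true, x)
        utils.append(-abc_posterior(y_obs, x).var())
    return float(np.mean(utils))

# The design x = 2.0 makes the data more informative about theta than
# x = 0.1, so its estimated expected utility should be larger.
u_small, u_large = expected_utility(0.1), expected_utility(2.0)
```

Note that the likelihood is never evaluated: only the simulator is called, which is what makes the approach applicable when a probability model is not easily available.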

References:

Hainy, M., Müller, W. G., Wynn, H. P., 2013. Approximate Bayesian computation design (ABCD), an introduction. In: Ucinski, D., Atkinson, A. C., Patan, M. (Eds.), mODa 10 – Advances in Model-Oriented Design and Analysis. Contributions to Statistics. Springer International Publishing, pp. 135-143.

Hainy, M., Müller, W., Wagner, H., 2015. Likelihood-free simulation-based optimal design with an application to spatial extremes. Stochastic Environmental Research and Risk Assessment, 1-12.

Tim Muiruri Kinyanjui (S3RI seminar speaker), University of Manchester

Title: Information content of household-stratified epidemics

Abstract: Household structure is a key driver of many infectious diseases, as well as a natural target for interventions such as vaccination programmes. Many theoretical and conceptual advances on household-stratified epidemic models are relatively recent, but have successfully managed to increase the applicability of such models to practical problems. To be of maximum realism, and hence benefit, they require parameterisation from epidemiological data, and while household-stratified final-size data have been the traditional source, time-series infection data from households are increasingly becoming available. This paper is concerned with the design of studies aimed at collecting time-series epidemic data in order to maximise the amount of information available to calibrate household models. A design decision involves a trade-off between the number of households to enrol and the sampling frequency. Two commonly used epidemiological study designs are considered: cross-sectional, where different households are sampled at every time point, and cohort, where the same households are followed over the course of the study period. The search for an optimal design uses Bayesian computationally intensive methods to explore the joint parameter-design space, combined with the Shannon entropy of the posteriors to estimate the amount of information in each design. For the cross-sectional design, the amount of information increases with the sampling intensity, i.e. the designs with the highest number of time points have the most information. On the other hand, the cohort design often exhibits a trade-off between the number of households sampled and the intensity of follow-up. Our results broadly support the choices made in existing epidemiological data collection studies. Prospective, problem-specific use of our computational methods can bring significant benefits in guiding future study designs.
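The entropy-based comparison of designs can be illustrated with a deliberately simplified stand-in for the household model: a single infection probability q observed through binomial counts, with the number of sampling times playing the role of the design. The grid prior, the binomial observation model, and the sample sizes below are assumptions for the sketch, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
grid = np.linspace(0.01, 0.99, 99)  # grid prior for infection probability q

def posterior_entropy(n_times, m=10, q_true=0.3):
    # Simulate n_times binomial counts (e.g. infected members out of m
    # per sampled visit), form the gridded posterior under a uniform
    # prior, and return its Shannon entropy in nats.
    y = rng.binomial(m, q_true, size=n_times)
    log_post = np.zeros_like(grid)
    for yi in y:
        log_post += yi * np.log(grid) + (m - yi) * np.log1p(-grid)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    return float(-np.sum(post * np.log(post + 1e-300)))

# Average over replicate epidemics: a denser sampling design leaves
# less posterior uncertainty, i.e. a lower expected entropy, so it
# carries more information about q.
h_sparse = np.mean([posterior_entropy(2) for _ in range(200)])
h_dense = np.mean([posterior_entropy(10) for _ in range(200)])
```

Lower expected posterior entropy corresponds to a more informative design, which is the sense in which the abstract's designs are ranked.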

 

Ben Torsney (S3RI seminar speaker), University of Glasgow

Title: Optimal Design, Lagrangian and Linear Model Theories: a Fusion.

Abstract: We consider the problem of optimizing a criterion of several variables, subject to them satisfying several linear equality constraints. Lagrangian Theory requires that, at an optimum, all partial derivatives be exactly linear in a set of Lagrange Multipliers. It seems we can argue that the partial derivatives, viewed as response variables, must exactly satisfy a Linear Model with the Lagrange Multipliers as parameters. This then is a model without errors, implying a fitted model with zero residuals. A special case is an approximate optimal design problem with the single "summation to one" constraint. In this case one formula for residuals defines "vertex directional derivatives". In our general problem any formula for residuals appears to play the role of directional derivatives. Further, if all variables are nonnegative, we can extend the multiplicative algorithm formulated for finding optimal design weights. This comprises two steps: a multiplicative step, under which we multiply each variable by a positive increasing function of its (vertex directional) derivative; and a scaling step, under which we scale these products to meet the "summation to one" constraint. The multiplicative step naturally extends to our more general problem and we believe that we have discovered a generalisation of the scaling step. Numerical results will be reported.

Speaker information

Werner Müller, Johannes Kepler Universität Linz, Austria

Tim Muiruri Kinyanjui, University of Manchester. Research Associate

Dr Bernard Torsney, University of Glasgow. Honorary Research Fellow (Statistics)
