Date: 2023-05-03

Time: 14:00-15:00 (UK time)

Bush House (NE) 1.04

Abstract

Every vector autoregressive process has an associated order p: given observations at the preceding p time points, the variable at time t is conditionally independent of the earlier history. Learning the order of the process is therefore important for its characterisation and subsequent use in forecasting. For example, the order can serve as a point of comparison between different data sets, and it informs the decomposition of the time series into latent processes, which provides information about the underlying dynamics. It is common to assume that a vector autoregressive process is stationary. A vector autoregression is stable, and hence admits a stationary solution, if and only if the roots of its characteristic equation lie outside the unit circle, which constrains the autoregressive coefficient matrices to lie in the stationary region. Unfortunately, the geometry of the stationary region can be very complicated, and specifying a prior distribution over it is difficult. In this work, the autoregressive coefficients are mapped to a set of transformed partial autocorrelation matrices which are unconstrained, allowing easier prior specification, routine computational inference, and meaningful interpretation of the magnitude of the elements of each matrix. The multiplicative gamma process is used to build a prior distribution for the unconstrained matrices which encourages increasing shrinkage of the partial autocorrelation parameters as the lag increases. Posterior inference is performed using Hamiltonian Monte Carlo via the probabilistic programming language Stan. Samples from the posterior distribution of the order of the process are obtained using a truncation criterion motivated by classical theory on the sampling distribution of the partial autocorrelation function. The methodology is applied in a simulation study to investigate the agreement between the posterior distribution for the order of the process and its known value, with promising results. The model and inferential procedures are then applied to EEG data from epilepsy patients in order to assess differences in the structure of the time series across different frequency bands.
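
As a minimal illustrative sketch (not code from the talk), the stability condition described above can be checked numerically: the roots of the characteristic equation lie outside the unit circle exactly when the eigenvalues of the VAR's companion matrix lie strictly inside it. The function names, the coefficient matrices and the multiplicative gamma process hyperparameters below are hypothetical choices for illustration, and the second function follows the common Bhattacharya-Dunson form of the multiplicative gamma process, which may differ in detail from the prior used in this work.

    import numpy as np

    def var_companion_matrix(coeff_matrices):
        # Stack the k x k coefficient matrices A_1, ..., A_p of a VAR(p)
        # into the (k*p) x (k*p) companion form [A_1 ... A_p; I 0].
        p = len(coeff_matrices)
        k = coeff_matrices[0].shape[0]
        top = np.hstack(coeff_matrices)
        bottom = np.eye(k * (p - 1), k * p)  # identity blocks shifting the lagged state
        return np.vstack([top, bottom])

    def is_stable(coeff_matrices, tol=1e-10):
        # Stable (and hence stationary) iff every companion-matrix eigenvalue
        # lies strictly inside the unit circle.
        eigvals = np.linalg.eigvals(var_companion_matrix(coeff_matrices))
        return np.max(np.abs(eigvals)) < 1.0 - tol

    def mgp_prior_variances(p, a1=2.0, a2=3.0, seed=None):
        # Illustrative draw of multiplicative gamma process scales:
        # tau_h = delta_1 * ... * delta_h, with delta_1 ~ Gamma(a1, 1) and
        # delta_h ~ Gamma(a2, 1) for h >= 2. With a2 > 2, the implied prior
        # variances 1/tau_h decrease in expectation as the lag h grows,
        # encouraging increasing shrinkage of higher-lag partial
        # autocorrelation parameters.
        rng = np.random.default_rng(seed)
        deltas = np.concatenate([rng.gamma(a1, 1.0, size=1),
                                 rng.gamma(a2, 1.0, size=p - 1)])
        return 1.0 / np.cumprod(deltas)

    # Hypothetical bivariate VAR(2) coefficients, chosen to be stable.
    A1 = np.array([[0.5, 0.1],
                   [0.0, 0.3]])
    A2 = np.array([[0.2, 0.0],
                   [0.1, 0.1]])
    print(is_stable([A1, A2]))             # True
    print(mgp_prior_variances(5, seed=1))  # variances tend to shrink with lag

In the approach described in the abstract the constraint is handled differently: rather than checking stability after the fact, the coefficients are reparameterised through unconstrained partial autocorrelation matrices, so that any unconstrained value corresponds to a process in the stationary region.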

Speaker

Sarah Heaps is an associate professor in Statistics at Durham University. She has a Ph.D. in Statistics from Newcastle University. Her research lies in the field of applied Bayesian inference, particularly in the areas of time series analysis and bioinformatics. Interdisciplinary collaboration and the development of prior structures for representing beliefs about multivariate parameters are recurring themes in her research.