Date: 2023-10-26

Time: 14:00-15:00 (UK time)

Strand S5.20

Abstract

Gaussian Processes and the Kullback-Leibler divergence have been studied extensively in Statistics and Machine Learning. This paper marries these two concepts and introduces the local Kullback-Leibler divergence to identify intervals where two Gaussian Processes differ the most. We also address the subtleties entailed in estimating local divergences and the corresponding interval of maximum local divergence. The estimation performance and numerical efficiency of the proposed method are showcased via a Monte Carlo simulation study. In a medical research context, we assess the potential of the devised tools in the analysis of electrocardiogram signals.
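To give a flavour of the idea, here is a minimal sketch (not the paper's actual estimator) of how one might localise the divergence between two Gaussian Processes: compare their marginal distributions pointwise via the closed-form Gaussian KL divergence, then slide a window over the resulting curve to find the interval where the average divergence is largest. The grid, window width, and toy means/variances below are all illustrative assumptions.

```python
import numpy as np

def kl_gauss(mu1, var1, mu2, var2):
    """Closed-form KL divergence KL(N(mu1, var1) || N(mu2, var2)), elementwise."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def max_divergence_interval(t, mu1, var1, mu2, var2, width):
    """Slide a window of `width` grid points over the pointwise KL curve and
    return the endpoints of the interval with the largest average divergence."""
    kl = kl_gauss(mu1, var1, mu2, var2)
    # Moving average of the KL curve via a cumulative sum.
    csum = np.concatenate(([0.0], np.cumsum(kl)))
    avg = (csum[width:] - csum[:-width]) / width
    start = int(np.argmax(avg))
    return t[start], t[start + width - 1]

# Two toy GP marginals on [0, 1]: identical except on (0.4, 0.6),
# where the second mean function is shifted upwards.
t = np.linspace(0.0, 1.0, 201)
mu1 = np.sin(2 * np.pi * t)
mu2 = mu1 + np.where((t > 0.4) & (t < 0.6), 1.0, 0.0)
var1 = np.full_like(t, 0.05)
var2 = np.full_like(t, 0.05)

lo, hi = max_divergence_interval(t, mu1, var1, mu2, var2, width=20)
print(f"interval of maximum local divergence: [{lo:.2f}, {hi:.2f}]")
```

As expected, the detected interval falls inside (0.4, 0.6), where the two mean functions disagree; the talk's methodology concerns the statistical estimation of such intervals from data rather than from known marginals.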

Speaker

Dr. Nicolás Hernández is a Senior Research Fellow at the Institute of Mathematics and Statistical Science of University College London. His research focuses on developing statistical and machine learning methods to tackle inferential problems in high-dimensional and functional data across fields such as energy, economics, the environment, demography, business, finance, health and genetics.