We are delighted to welcome Amir Asadi from the University of Cambridge to talk about their work.
Title: Multiscale Machine Learning: An Information-Theoretic Approach
Abstract: Machine learning models draw on two fundamental sources of information when learning a task: observed empirical data and prior information about the structure of the task. As data dimensionality increases in modern applications, the curse of dimensionality makes such structural information increasingly crucial for effective learning and generalisation. In this talk, I will present an information-theoretic framework for multiscale machine learning, which studies learning problems where the data are known to possess a multiscale structure, and this property is incorporated as prior knowledge. The key idea is to view learning as a dynamic programming process across scales: a complex problem is decomposed into scale-separated subproblems, solved hierarchically, and recombined to exploit scale-smoothness and scale-invariance.
I will illustrate this viewpoint through three applications: in sampling, by studying maximum-entropy and maximum hierarchical-entropy distributions with modular Gaussian and Ising models; in predictive modelling, through recent work on ladder decompositions of diffeomorphisms; and in generative modelling, by discussing work on generating natural images.
Please email informed-ai@bristol.ac.uk if you’d like to register and join the seminar, either in person at the University of Cambridge or online.