OptAzur: Optimization on the French Riviera #
OptAzur is an ongoing effort to foster collaborations among members of Université Côte d’Azur on different aspects of optimization and its applications to machine learning, imaging, signal processing, and beyond.
OptAzur organizes a monthly seminar in Nice and Sophia-Antipolis, which alternates between the two sites and takes place on the third Monday of each month.
Next talk #
Monday, November 20th, 2023 (LJAD, Nice)
14h - Massimiliano Pontil (Italian Institute of Technology and University College London)
Learning Dynamical Systems Via Koopman Operator Regression
Non-linear dynamical systems can be handily described by the associated Koopman operator, whose action evolves every observable of the system forward in time. These operators are instrumental to forecasting and interpreting the system dynamics, and have broad applications in science and engineering. The talk gives a gentle introduction to this topic, with a focus on theory and algorithms. We highlight the importance of algorithms that allow us to estimate the spectral decomposition of the Koopman operator well and explore how the quest for good representations for these operators can be formulated as an optimization problem involving neural networks.
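As a toy illustration of the regression viewpoint (a sketch under simplifying assumptions, not the speaker's method or code), the Koopman operator restricted to the span of a fixed feature dictionary can be estimated by ordinary least squares in the EDMD style; the function and variable names below are hypothetical:

```python
import numpy as np

def edmd_koopman(X, Y, features):
    """EDMD-style least-squares estimate of the Koopman operator restricted
    to the span of a feature dictionary: given snapshot pairs (x_t, x_{t+1})
    stacked as rows of X and Y, solve min_K ||features(X) K - features(Y)||_F.
    The eigenvalues of K approximate part of the Koopman spectrum."""
    PX, PY = features(X), features(Y)
    K, *_ = np.linalg.lstsq(PX, PY, rcond=None)
    return K

# Sanity check on a linear system x_{t+1} = A x_t with identity features,
# where the restricted Koopman operator is exactly A^T.
A = np.array([[0.9, 0.1],
              [0.0, 0.8]])
xs = [np.array([1.0, 1.0])]
for _ in range(10):
    xs.append(A @ xs[-1])
X, Y = np.stack(xs[:-1]), np.stack(xs[1:])
K = edmd_koopman(X, Y, lambda Z: Z)   # identity dictionary
```

With richer (e.g. neural-network) dictionaries, the same least-squares structure remains, and the talk's optimization problem is precisely the search for a dictionary in which this restriction is accurate.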
15h15 - Mathieu Carrière (Inria)
A Framework to Differentiate Persistent Homology with Applications in Machine Learning and Statistics
Solving optimization tasks based on functions and losses with a topological flavor is a very active and growing field of research in data science and Topological Data Analysis, with applications in non-convex optimization, statistics and machine learning. However, the approaches proposed in the literature are usually anchored to a specific application and/or topological construction, and do not come with theoretical guarantees. To address this issue, we study the differentiability of a general map associated with the most common topological construction, that is, the persistence map. Building on real analytic geometry arguments, we propose a general framework that allows one to define and compute gradients for persistence-based functions in a very simple way. We also provide a simple, explicit and sufficient condition for convergence of stochastic subgradient methods for such functions. This result encompasses all the constructions and applications of topological optimization in the literature. Finally, we will showcase some associated code, which is easy to handle and to mix with other non-topological methods and constraints, as well as some experiments demonstrating the versatility of the approach.
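For intuition only (a minimal one-dimensional toy, not the general framework of the talk): persistence-based losses are locally functions of critical values of the input, so a subgradient places weight exactly at the indices paired in the persistence diagram. The sketch below computes 0-dimensional sublevel-set persistence of a sequence on a path graph with a union-find pass, then a subgradient of total persistence; all names are hypothetical:

```python
import numpy as np

def persistence_pairs_1d(f):
    """0-dim persistence pairs (birth_idx, death_idx) of the sublevel-set
    filtration of a sequence f on a path graph. The essential bar of the
    global minimum never dies and is omitted."""
    order = np.argsort(f, kind="stable")
    parent, birth, pairs = {}, {}, []

    def find(i):                      # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for v in order:                   # add vertices in increasing value
        parent[v], birth[v] = v, v
        for u in (v - 1, v + 1):      # already-processed path neighbours
            if u in parent:
                ru, rv = find(u), find(v)
                if ru == rv:
                    continue
                # elder rule: the component born at the larger value dies
                if f[birth[ru]] > f[birth[rv]]:
                    ru, rv = rv, ru
                if birth[rv] != v:    # skip zero-length diagonal pairs
                    pairs.append((birth[rv], v))
                parent[rv] = ru
    return pairs

def total_persistence_and_grad(f):
    """Total persistence sum(death - birth) over finite bars, and its
    subgradient w.r.t. f: +1 at each death index, -1 at each birth index."""
    grad, loss = np.zeros(len(f)), 0.0
    for b, d in persistence_pairs_1d(f):
        loss += f[d] - f[b]
        grad[d] += 1.0
        grad[b] -= 1.0
    return loss, grad
```

Because the pairing is piecewise constant in f, this gradient is valid away from ties, which is exactly the kind of genericity that stochastic subgradient analyses for persistence-based functions have to handle.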
Previous talks #
Titles and abstracts here
- #1: Jean-François Aujol (Université de Bordeaux) and Luca Calatroni (CNRS, I3S)
- #2: Gersende Fort (CNRS, Institut de Mathématiques de Toulouse) and Samuel Vaiter (CNRS, Laboratoire J. A. Dieudonné)
OptAzur members also organize several conferences and workshops relevant to the optimization community, as detailed below.
Bilevel optimization in machine learning and imaging sciences workshop @ICIAM 2023, Tokyo, Japan. (Organizers: L. Calatroni, S. Vaiter)