Abstract: In this talk, I will discuss several topics. The first is optimization over low-rank matrix and tensor manifolds, which often appear in applications. Low-rank approximation of matrices is one of the rare examples where a non-convex problem can be solved in a numerically exact way using the singular value decomposition (SVD). There also exists a large class of methods for solving optimization problems with low-rank constraints.
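The "numerically exact" claim is the Eckart–Young theorem: truncating the SVD gives the best rank-k approximation in the Frobenius norm. A minimal sketch of this (illustrative only, not from the talk; the matrix, sizes, and rank are arbitrary choices):

```python
import numpy as np

# A random matrix stands in for the data; sizes and rank are arbitrary.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 6))
k = 2

# Truncated SVD: keep only the k largest singular triplets.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By the Eckart–Young theorem, A_k is the best rank-k approximation of A
# in the Frobenius norm, and the approximation error equals the norm of
# the discarded singular values.
err = np.linalg.norm(A - A_k)
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))
```

Despite the non-convexity of the rank constraint, this closed-form truncation is globally optimal, which is what makes low-rank approximation such an unusual case among non-convex problems.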
In the second part of the talk (if time permits), I will discuss the peculiarities of optimization of deep neural networks. The theory of such optimization is still largely a mystery: there are many empirical results, while theoretical results often hold only under unrealistic assumptions. Here I plan to highlight the main points and research directions.
Bio: Ivan Oseledets is the Director of the Center for Artificial Intelligence Technology and Head of the Laboratory of Computational Intelligence at Skoltech.
Ivan’s research covers a broad range of topics. He proposed a new decomposition of high-dimensional arrays (tensors), the tensor-train decomposition, and has developed many efficient algorithms for solving high-dimensional problems. These algorithms are used in various areas of chemistry, biology, data analysis, and machine learning. His current research focuses on the development of new algorithms in machine learning and artificial intelligence, such as the construction of adversarial examples, the theory of generative adversarial networks, and the compression of neural networks.
Ivan Oseledets has received several awards for his research and industrial cooperation, including two gold medals from the Russian Academy of Sciences (for students in 2005 and for young researchers in 2009), the SIAM Outstanding Paper Prize (2018), the Russian President Award for young researchers in science and innovation (2018), the Moscow Government Prize for Young Scientists (2023), the Best Professor award from Skoltech (2019), and the best cooperation project leader award from Huawei (2015, 2017).