Papers by laboratory staff were submitted to the international machine learning conferences ICLR and AISTATS. Junior members of the laboratory also took part in the work on the papers.

The following papers were highlighted:

1) Stochastic Frank-Wolfe: Unified Analysis and New Faces of Classical Method (AISTATS). Authors: Ruslan Nazykov, Alexander Shestakov, Vladimir Solodkin, Alexander Gasnikov, Alexander Beznosikov.

2) Breaking the Heavy-Tailed Noise Barrier in Stochastic Optimization Problems (AISTATS). Authors: Nikita Puchkin, Eduard Gorbunov, Nikolai Kutuzov and Alexander Gasnikov.

3) Ito Diffusion Approximation of Universal Ito Chains for Sampling, Optimization and Boosting (ICLR). Authors: Alexey Ustimenko and Alexander Beznosikov.

4) Advancing the lower bounds: An accelerated, stochastic, second-order method with optimal adaptation to inexactness (ICLR). Authors: Artem Agafonov, Dmitry Kamzolov, Alexander Gasnikov and others.

5) Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates (AISTATS). Authors: Nikita Fedin, Eduard Gorbunov and others.

“We started working on the paper at the beginning of February 2023 as part of the optimization course. We chose this particular topic because creating a universal framework for analyzing Frank-Wolfe-type algorithms would let us obtain new convergent algorithms and, moreover, broaden our understanding of this class of methods. I proved the convergence of stochastic Frank-Wolfe-type algorithms under certain assumptions on the stochastic gradient, derived the asymptotics of several methods, in particular in distributed optimization, and played a large part in preparing the appendix of the paper,” says Alexander Shestakov, one of the authors of Stochastic Frank-Wolfe: Unified Analysis and New Faces of Classical Method.

According to the researcher, the paper proves that, under certain assumptions on the stochastic gradient (assumptions that are quite natural for existing algorithms), Frank-Wolfe-type algorithms converge, and the team was able to sharpen the rates of this convergence.

“This result expands our understanding of algorithms of this type and makes it possible to study the convergence of new algorithms using much simpler techniques than before,” Alexander summarized.
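For readers unfamiliar with the method, below is a minimal Python sketch of a single stochastic Frank-Wolfe loop. It is an illustration only, not the algorithm analyzed in the paper: the quadratic objective, the Gaussian gradient noise, and the classical 2/(t+2) step size are all assumptions chosen for the example.

```python
# Minimal sketch of stochastic Frank-Wolfe (illustrative, not the paper's method):
# minimize f(x) = 0.5 x^T Q x over the probability simplex using noisy gradients.
import numpy as np

rng = np.random.default_rng(0)
d = 50
A = rng.standard_normal((d, d))
Q = A.T @ A / d                 # positive semi-definite quadratic objective
x = np.ones(d) / d              # start at the center of the simplex

for t in range(1, 501):
    # Stochastic gradient oracle: true gradient plus illustrative Gaussian noise
    grad = Q @ x + 0.1 * rng.standard_normal(d)
    # Linear minimization oracle over the simplex returns a vertex (basis vector)
    s = np.zeros(d)
    s[np.argmin(grad)] = 1.0
    gamma = 2.0 / (t + 2)       # classical Frank-Wolfe step size schedule
    x = (1 - gamma) * x + gamma * s   # convex combination keeps x feasible

print("final objective value:", 0.5 * x @ Q @ x)
```

Even in this toy setting, the appeal of the method is visible: each step calls only a linear minimization oracle over the feasible set (here, picking a simplex vertex), so the iterates stay feasible without any projection step.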

AISTATS, the International Conference on Artificial Intelligence and Statistics, brings together researchers working at the intersection of computer science, artificial intelligence, machine learning, statistics and related fields. Topics covered at the conference include machine learning methods and algorithms, probabilistic methods, and reinforcement learning, among others.

The ICLR conference, in turn, focuses on the development of artificial intelligence. It is renowned worldwide for cutting-edge research on all aspects of deep learning as applied to artificial intelligence, statistics, and data science, as well as to critical application areas such as computational biology, speech recognition, text understanding, games, and robotics.
