The laboratory staff received the best paper prize at the 22nd International Optimization Conference MOTOR, held from 2 to 8 July 2023 in Yekaterinburg. The conference was dedicated to the 90th anniversary of Academician I.I. Eremin.

The authors of the paper «Online Optimization Problems with Functional Constraints under Relative Lipschitz Continuity and Relative Strong Convexity Conditions» are Oleg Savchuk, Fedor Stonyakin, Mohammad Alkousa, Rida Zabirova, Alexander Titov and Alexander Gasnikov.

According to Oleg Savchuk, one of the main authors of the paper, the aim of the work was to investigate the computational guarantees of mirror descent algorithms on a class of convex online optimization problems with inequality-type functional constraints under conditions of relative Lipschitz continuity and relative strong convexity.

«Combining the ideas of adaptive regularization and of the convergence of the mirror descent method with switching between productive and unproductive steps, on a class of relatively strongly convex online optimization problems with convex inequality-type constraints, we proposed extensions of the mirror descent method. By this we mean schemes with switching between productive and unproductive steps, with and without iterative regularization, for relatively strongly convex and relatively Lipschitz online optimization problems with functional constraints. The proposed approach eliminates the need to know in advance a lower bound on the (relative) strong convexity parameters of the observed functions, and may make it possible to avoid additional projection operations onto the feasible set (if the latter is described by a system of inequalities) during iterations,» Savchuk shared.
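The switching between productive and unproductive steps mentioned in the quote can be sketched as follows. This is only a minimal illustration of the classical switching subgradient scheme with a Euclidean prox-function, not the authors' adaptive regularized variants; the problem, function names, and parameter values are all illustrative assumptions.

```python
import numpy as np

def mirror_descent_switching(f_grad, g, g_grad, x0, step, n_iters, eps):
    """Subgradient method with switching over productive/unproductive steps.

    On each iteration: if the constraint g(x) <= eps is (almost) satisfied,
    take a 'productive' step along a subgradient of the objective f;
    otherwise take an 'unproductive' step along a subgradient of g to
    reduce the constraint violation.  With the Euclidean prox-function the
    mirror step reduces to a plain subgradient step.
    """
    x = np.asarray(x0, dtype=float)
    productive = []                      # productive iterates collected here
    for _ in range(n_iters):
        if g(x) <= eps:                  # productive step: work on the objective
            direction = f_grad(x)
            productive.append(x.copy())
        else:                            # unproductive step: restore feasibility
            direction = g_grad(x)
        x = x - step * direction         # Euclidean mirror (subgradient) step
    # such schemes typically output the average of the productive iterates
    return np.mean(productive, axis=0) if productive else x

# Toy problem: minimize ||x - (1, 1)||^2 subject to x[0] - 0.5 <= 0,
# whose solution is x* = (0.5, 1).
c = np.array([1.0, 1.0])
x_hat = mirror_descent_switching(
    f_grad=lambda x: 2.0 * (x - c),
    g=lambda x: x[0] - 0.5,
    g_grad=lambda x: np.array([1.0, 0.0]),
    x0=[0.0, 0.0], step=0.05, n_iters=2000, eps=0.01,
)
```

Averaging only the productive iterates is the standard output rule for such schemes: unproductive steps merely steer the trajectory back toward the feasible set and carry no guarantee on the objective.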

For all the proposed algorithms, the corresponding theoretical results and regret bounds are established, which improve the existing convergence rate estimates for the mirror descent method with functional constraints. Oleg Savchuk's plans for the future are to continue and successfully complete his postgraduate studies, and to prepare and defend his PhD thesis.

Mohammad Alkousa carried out a number of computational experiments comparing the proposed methods with similar well-known approaches that do not rely on strong convexity assumptions, and, together with Alexander Titov, helped with the introduction and proofreading of the text of the paper. The laboratory employee shared his impressions of participating in the conference:

«Alexander Gasnikov and Fedor Stonyakin proposed the general idea of the work together with a plan. My contribution mainly concerned the section on computational experiments and proofreading of the English text of the paper. Fedor Stonyakin asked me to prepare a talk and present the results of this paper at the conference (the presentation was given online via Zoom). The presentation went well. In general, since 2019 we have regularly published at least one paper at the MOTOR (and OPTIMA) conferences. For the future there are many plans for joint work and projects with my colleagues; among the topics we can mention: the mirror descent method for non-smooth optimization problems with a sharp minimum, conditional gradient type methods and their applications, stochastic gradient type methods, distributed optimization methods, and intermediate methods for smooth convex problems with an inexact gradient».

The conference brought together the research community in the fields of mathematical programming and global optimization, discrete optimization, complexity theory and combinatorial algorithms, optimal control and games, as well as their applications to practical problems of operations research, mathematical economics and data analysis.

The laboratory staff congratulates colleagues on this achievement and wishes further success in research!
