Publications

2023

Tominin, Y.D., Tominin, V.D., Borodich, E.D., Kovalev, D.A., Dvurechensky, P.E., Gasnikov, A.V., Chukanov, S.V. On Accelerated Methods for Saddle-Point Problems with Composite Structure (2023) Computer Research and Modeling, 15 (2), pp. 433-467. Scopus DOI Q4

Lobanov, A., Anikin, A., Gasnikov, A., Gornov, A., Chukanov, S. Zero-Order Stochastic Conditional Gradient Sliding Method for Non-smooth Convex Optimization (2023) Communications in Computer and Information Science, 1881 CCIS, pp. 92-106. Scopus DOI Q4

Savchuk, O., Stonyakin, F., Alkousa, M., Zabirova, R., Titov, A., Gasnikov, A. Online Optimization Problems with Functional Constraints Under Relative Lipschitz Continuity and Relative Strong Convexity Conditions (2023) Communications in Computer and Information Science, 1881 CCIS, pp. 29-43. Scopus DOI Q4

Dvurechensky, P., Gasnikov, A., Tyurin, A., Zholobov, V. Unifying Framework for Accelerated Randomized Methods in Convex Optimization (2023) Springer Proceedings in Mathematics and Statistics, 425, pp. 511-561. Scopus DOI

2022

Kovalev, D., Beznosikov, A., Borodich, E., Gasnikov, A., Scutari, G. Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Gasnikov, A., Richtárik, P. Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Tian, Y., Scutari, G., Cao, T., Gasnikov, A. Acceleration in Distributed Optimization under Similarity (2022) Proceedings of Machine Learning Research, 151, pp. 5721-5756. Scopus WOS DOI A*

Beznosikov, A., Richtárik, P., Diskin, M., Ryabinin, M., Gasnikov, A. Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Beznosikov, A., Dvurechensky, P., Koloskova, A., Samokhin, V., Stich, S.U., Gasnikov, A. Decentralized Local Stochastic Extra-Gradient for Variational Inequalities (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Beznosikov, A., Sadiev, A., Persiianov, M., Richtárik, P., Gasnikov, A. Optimal Algorithms for Decentralized Stochastic Variational Inequalities (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Gasnikov, A. The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Gorbunov, E., Danilova, M., Dobre, D., Dvurechensky, P., Gasnikov, A., Gidel, G. Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Gasnikov, A. The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Gasnikov, A., Novitskii, A., Novitskii, V., Abdukhakimov, F., Kamzolov, D., Beznosikov, A., Takáč, M., Dvurechensky, P., Gu, B. The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems (2022) Proceedings of Machine Learning Research, 162, pp. 7241-7265. Scopus WOS DOI A*

Hanzely, S., Kamzolov, D., Pasechnyuk, D., Gasnikov, A., Richtárik, P., Takáč, M. A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*
