Publications


2022

Kovalev, D., Gasnikov, A., Richtárik, P. Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Tian, Y., Scutari, G., Cao, T., Gasnikov, A. Acceleration in Distributed Optimization under Similarity (2022) Proceedings of Machine Learning Research, 151, pp. 5721-5756. Scopus WOS DOI A*

Beznosikov, A., Richtárik, P., Diskin, M., Ryabinin, M., Gasnikov, A. Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Beznosikov, A., Dvurechensky, P., Koloskova, A., Samokhin, V., Stich, S.U., Gasnikov, A. Decentralized Local Stochastic Extra-Gradient for Variational Inequalities (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Beznosikov, A., Sadiev, A., Persiianov, M., Richtárik, P., Gasnikov, A. Optimal Algorithms for Decentralized Stochastic Variational Inequalities (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Gasnikov, A. The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Gorbunov, E., Danilova, M., Dobre, D., Dvurechensky, P., Gasnikov, A., Gidel, G. Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Kovalev, D., Gasnikov, A. The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Gasnikov, A., Novitskii, A., Novitskii, V., Abdukhakimov, F., Kamzolov, D., Beznosikov, A., Takáč, M., Dvurechensky, P., Gu, B. The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems (2022) Proceedings of Machine Learning Research, 162, pp. 7241-7265. Scopus WOS DOI A*

Hanzely, S., Kamzolov, D., Pasechnyuk, D., Gasnikov, A., Richtárik, P., Takáč, M. A Damped Newton Method Achieves Global O(1/K2) and Local Quadratic Convergence Rate (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI A*

Tiapkin, D., Gasnikov, A. Primal-Dual Stochastic Mirror Descent for MDPs (2022) Proceedings of Machine Learning Research, 151, pp. 9723-9740. Scopus WOS DOI A*

Stonyakin, F., Gasnikov, A., Dvurechensky, P., Titov, A., Alkousa, M. Generalized Mirror Prox Algorithm for Monotone Variational Inequalities: Universality and Inexact Oracle (2022) Journal of Optimization Theory and Applications, 194 (3), pp. 988-1013. Scopus WOS DOI Q1

Gorbunov, E., Dvurechensky, P., Gasnikov, A. An accelerated method for derivative-free smooth stochastic convex optimization (2022) SIAM Journal on Optimization, 32 (2), pp. 1210-1238. Scopus WOS DOI Q1

Ivanova, A., Dvurechensky, P., Vorontsova, E., Pasechnyuk, D., Gasnikov, A., Dvinskikh, D., Tyurin, A. Oracle Complexity Separation in Convex Optimization (2022) Journal of Optimization Theory and Applications, 193 (1), pp. 462-490. Scopus WOS DOI Q1

Anikin, A., Gasnikov, A., Gornov, A., Kamzolov, D., Maximov, Y., Nesterov, Y. Efficient numerical methods to solve sparse linear equations with application to PageRank (2022) Optimization Methods and Software, 37 (3), pp. 907-935. Scopus WOS DOI Q1
