Publications


2022

Maslovskiy, A., Kunitsyn, A., Gasnikov, A. Application of Attention Technique for Digital Pre-distortion (2022) Communications in Computer and Information Science, 1739 CCIS, pp. 168-182. Scopus WOS DOI Q4

Pletnev, N.V., Dvurechensky, P.E., Gasnikov, A.V. Application of gradient optimization methods to solve the Cauchy problem for the Helmholtz equation [Применение градиентных методов оптимизации для решения задачи Коши для уравнения Гельмгольца] (2022) Computer Research and Modeling, 14 (2), pp. 417-444. Scopus WOS DOI Q4

Kovalev, D., Beznosikov, A., Borodich, E., Gasnikov, A., Scutari, G. Optimal Gradient Sliding and its Application to Distributed Optimization Under Similarity (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI 

Kovalev, D., Gasnikov, A., Richtárik, P. Accelerated Primal-Dual Gradient Method for Smooth and Convex-Concave Saddle-Point Problems with Bilinear Coupling (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Tian, Y., Scutari, G., Cao, T., Gasnikov, A. Acceleration in Distributed Optimization under Similarity (2022) Proceedings of Machine Learning Research, 151, pp. 5721-5756. Scopus WOS DOI

Beznosikov, A., Richtárik, P., Diskin, M., Ryabinin, M., Gasnikov, A. Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Beznosikov, A., Dvurechensky, P., Koloskova, A., Samokhin, V., Stich, S.U., Gasnikov, A. Decentralized Local Stochastic Extra-Gradient for Variational Inequalities (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Kovalev, D., Beznosikov, A., Sadiev, A., Persiianov, M., Richtárik, P., Gasnikov, A. Optimal Algorithms for Decentralized Stochastic Variational Inequalities (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Kovalev, D., Gasnikov, A. The First Optimal Algorithm for Smooth and Strongly-Convex-Strongly-Concave Minimax Optimization (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Gorbunov, E., Danilova, M., Dobre, D., Dvurechensky, P., Gasnikov, A., Gidel, G. Clipped Stochastic Methods for Variational Inequalities with Heavy-Tailed Noise (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Kovalev, D., Gasnikov, A. The First Optimal Acceleration of High-Order Methods in Smooth Convex Optimization (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Gasnikov, A., Novitskii, A., Novitskii, V., Abdukhakimov, F., Kamzolov, D., Beznosikov, A., Takáč, M., Dvurechensky, P., Gu, B. The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems (2022) Proceedings of Machine Learning Research, 162, pp. 7241-7265. Scopus WOS DOI

Hanzely, S., Kamzolov, D., Pasechnyuk, D., Gasnikov, A., Richtárik, P., Takáč, M. A Damped Newton Method Achieves Global O(1/k²) and Local Quadratic Convergence Rate (2022) Advances in Neural Information Processing Systems, 35. Scopus WOS DOI

Tiapkin, D., Gasnikov, A. Primal-Dual Stochastic Mirror Descent for MDPs (2022) Proceedings of Machine Learning Research, 151, pp. 9723-9740. Scopus WOS DOI

2021

Dvurechensky, P., Gorbunov, E., Gasnikov, A. An accelerated directional derivative method for smooth stochastic convex optimization (2021) European Journal of Operational Research, 290 (2), pp. 601-621. Scopus DOI Q1
