Convex Optimization Problems

2011 · pp. 479-512
2021 · Vol 0 (0) · Author(s): Darina Dvinskikh, Alexander Gasnikov

Abstract: We introduce primal and dual stochastic gradient oracle methods for decentralized convex optimization problems. For both the primal and the dual oracle, the proposed methods are optimal in the number of communication steps. However, across all considered classes of objectives, optimality in the number of oracle calls per node holds only up to a logarithmic factor and up to the notion of smoothness. Using a mini-batching technique, we show that the proposed methods with a stochastic oracle can additionally be parallelized at each node. The considered algorithms can be applied to many data science problems and inverse problems.
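The abstract does not spell out the algorithms, but the ingredients it names (a per-node stochastic gradient oracle, communication steps between nodes, and mini-batching to reduce oracle noise) can be illustrated with a toy decentralized SGD loop. The ring topology, step size, and quadratic local objectives below are illustrative assumptions, not the authors' method: node `i` minimizes f_i(x) = 0.5·(x − a[i])², so the global optimum of the average objective is mean(a). Each round performs one communication step (gossip averaging with ring neighbors) and one mini-batched call to the local stochastic oracle:

```python
import random

def decentralized_sgd(a, rounds=400, step=0.05, batch=8, noise=0.1, seed=0):
    """Toy decentralized SGD on a ring (illustrative, not the paper's method).

    Node i holds the local objective f_i(x) = 0.5 * (x - a[i])**2, so the
    minimizer of the average objective (1/n) * sum_i f_i is mean(a).
    Each round: one communication step (average with ring neighbors),
    then one mini-batched stochastic gradient step per node.
    """
    rng = random.Random(seed)
    n = len(a)
    x = [0.0] * n  # local iterates, one per node
    for _ in range(rounds):
        # communication step: gossip averaging over the ring topology
        mixed = [(x[(i - 1) % n] + x[i] + x[(i + 1) % n]) / 3.0
                 for i in range(n)]
        for i in range(n):
            # mini-batching: average `batch` noisy oracle calls; the calls
            # are independent, so in practice they can run in parallel
            g = sum(mixed[i] - a[i] + rng.gauss(0.0, noise)
                    for _ in range(batch)) / batch
            x[i] = mixed[i] - step * g
    return x

a = [1.0, 2.0, 3.0, 4.0, 5.0]
xs = decentralized_sgd(a)
# every local iterate ends up near the global optimum mean(a) = 3.0
```

Mini-batching enters only in the inner sum: averaging `batch` independent oracle calls cuts the gradient-noise variance by the batch size, which is exactly what makes the per-node work parallelizable.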


2018 · Vol 5 (1) · pp. 42-60 · Author(s): Akshay Agrawal, Robin Verschueren, Steven Diamond, Stephen Boyd
