CIAOAlgorithms implements Block-Coordinate and Incremental Aggregated Optimization Algorithms for minimization problems of the form

```math
\underset{x \in \mathbb{R}^n}{\text{minimize}} \quad \frac{1}{N} \sum_{i=1}^N f_i(x) + g(x)
```

or

```math
\underset{x_i \in \mathbb{R}^{n_i}}{\text{minimize}} \quad f(x_1, \dots, x_N) + \sum_{i=1}^N g_i(x_i)
```

where the smooth components have Lipschitz-continuous gradients and the nonsmooth terms are proximable.
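For example, a lasso objective fits the first template with fᵢ(x) = ½(aᵢᵀx − bᵢ)² and g(x) = λ‖x‖₁. A minimal plain-Julia sketch, with made-up data for illustration:

```julia
using LinearAlgebra

# Lasso as an instance of the finite-sum template:
# fᵢ(x) = ½(aᵢᵀx - bᵢ)²  (smooth),  g(x) = λ‖x‖₁  (nonsmooth, proximable)
N, n = 50, 10                          # number of components and dimension (illustrative)
A, b, λ = randn(N, n), randn(N), 0.1

f(x, i) = 0.5 * (dot(A[i, :], x) - b[i])^2
g(x) = λ * norm(x, 1)
cost(x) = sum(f(x, i) for i in 1:N) / N + g(x)
```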
To install CIAOAlgorithms, press `]` in the Julia REPL to enter the package manager, then run

```
pkg> add CIAOAlgorithms
```
Simple lasso and logistic regression test examples can be found here.
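For a quick sense of the workflow, here is a hypothetical lasso setup. The `ProximalOperators` calls are standard, but the `Finito` constructor and the solver call signature below are assumptions made for illustration; consult the linked test examples for the actual interface.

```julia
using CIAOAlgorithms, ProximalOperators, LinearAlgebra

N, n, λ = 50, 10, 0.1
A, b = randn(N, n), randn(N)

# fᵢ(x) = ½‖Aᵢx - bᵢ‖² as smooth components, g(x) = λ‖x‖₁ as the regularizer
F = [LeastSquares(A[i:i, :], b[i:i]) for i in 1:N]
g = NormL1(λ)
L = maximum(norm(A[i, :])^2 for i in 1:N)     # Lipschitz constants of ∇fᵢ

solver = CIAOAlgorithms.Finito{Float64}()     # assumed constructor; options may differ
x, it = solver(F, g, zeros(n), N = N, L = L)  # assumed call signature; see the examples
```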
| Algorithm | Function | Reference |
| --- | --- | --- |
| Finito/MISO/DIAG | `Finito` | [1], [4], [8], [9] |
| ProShI | `Proshi` | [9] |
| SAGA | `SAGA` | [3], [6] |
| SAG | `SAG` | [7] |
| SVRG/SVRG++ | `SVRG` | [2], [4], [5] |
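To make the mechanics concrete, below is a self-contained, didactic sketch of the proximal SAGA iteration of [3], [6] applied to the lasso instance above; it is not the package's implementation, just the bare update rule in plain Julia.

```julia
using LinearAlgebra, Random

# Didactic proximal-SAGA sketch for
#   minimize (1/N) Σᵢ ½(aᵢᵀx - bᵢ)² + λ‖x‖₁
function saga_lasso(A, b, λ; iters = 10_000, seed = 0)
    Random.seed!(seed)
    N, n = size(A)
    grad(x, i) = A[i, :] .* (dot(A[i, :], x) - b[i])      # ∇fᵢ(x)
    soft(z, t) = sign.(z) .* max.(abs.(z) .- t, 0)        # prox of t‖·‖₁
    γ = 1 / (3 * maximum(norm(A[i, :])^2 for i in 1:N))   # step size 1/(3L) from [3]
    x = zeros(n)
    mem = [grad(x, i) for i in 1:N]                       # table of stored gradients
    avg = sum(mem) ./ N                                   # running average of the table
    for _ in 1:iters
        j = rand(1:N)
        gj = grad(x, j)
        x = soft(x .- γ .* (gj .- mem[j] .+ avg), γ * λ)  # variance-reduced prox step
        avg = avg .+ (gj .- mem[j]) ./ N
        mem[j] = gj
    end
    return x
end

x_star = saga_lasso(randn(100, 20), randn(100), 0.1)
```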
[1] Defazio, Domke, Finito: A faster, permutable incremental gradient method for big data problems, In International Conference on Machine Learning, pp. 1125–1133 (2014).
[2] Xiao, Zhang, A proximal stochastic gradient method with progressive variance reduction, SIAM Journal on Optimization 24(4), 2057–2075 (2014).
[3] Defazio, Bach, Lacoste-Julien, SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives, In Advances in Neural Information Processing Systems, pp. 1646–1654 (2014).
[4] Mairal, Incremental majorization-minimization optimization with application to large-scale machine learning, SIAM Journal on Optimization 25(2), 829–855 (2015).
[5] Allen-Zhu, Yuan, Improved SVRG for non-strongly-convex or sum-of-non-convex objectives, In Proceedings of the 33rd International Conference on Machine Learning, pp. 1080–1089 (2016).
[6] Reddi, Sra, Poczos, Smola, Proximal stochastic methods for nonsmooth nonconvex finite-sum optimization, In Advances in Neural Information Processing Systems, pp. 1145–1153 (2016).
[7] Schmidt, Le Roux, Bach, Minimizing finite sums with the stochastic average gradient, Mathematical Programming 162(1–2), 83–112 (2017).
[8] Mokhtari, Gürbüzbalaban, Ribeiro, Surpassing gradient descent provably: A cyclic incremental method with linear convergence rate, SIAM Journal on Optimization 28(2), 1420–1447 (2018).
[9] Latafat, Themelis, Patrinos, Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems, arXiv:1906.10053 (2019).