A first-order primal-dual method with adaptivity to local smoothness

Authors: Maria-Luiza Vladarean, Yura Malitsky, Volkan Cevher

NeurIPS 2021

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Numerical experiments are provided for illustrating the practical performance of the algorithm. We now present some numerical experiments conducted for APDA.
Researcher Affiliation | Academia | Maria-Luiza Vladarean, Yura Malitsky, Volkan Cevher; {maria-luiza.vladarean, volkan.cevher}@epfl.ch, yurii.malitskyi@liu.se; LIONS, École Polytechnique Fédérale de Lausanne, Switzerland; Linköping University, Sweden
Pseudocode | Yes | Algorithm 1: Adaptive Primal-Dual Algorithm (APDA)
Input: x^0 ∈ X, y^0 ∈ Y, τ_init > 0, τ_0 = +∞, θ_0 = 1, β > 0, c ∈ (0, 1)
x^1 = x^0 − τ_init(∇f(x^0) + Aᵀy^0)
for k = 1, 2, . . . do
  Set τ_k = min{ 1 / (2√(L_k² + (β/(1−c))‖A‖²)), τ_{k−1}√(1 + θ_{k−1}) },  σ_k = βτ_k,  θ_k = τ_k/τ_{k−1}
  x̄^k = x^k + θ_k(x^k − x^{k−1})
  y^{k+1} = prox_{σ_k g*}(y^k + σ_k A x̄^k)
  x^{k+1} = x^k − τ_k(∇f(x^k) + Aᵀy^{k+1})
end for
(A runnable sketch of this loop follows the table.)
Open Source Code | Yes | See https://github.com/mvladarean/adaptive_pda.
Open Datasets | Yes | We consider the problem of sparse binary Logistic Regression on 4 LIBSVM datasets [Chang and Lin, 2011]
Dataset Splits | No | The paper mentions using LIBSVM datasets but does not provide specific details on training, validation, or test splits, such as percentages or sample counts.
Hardware Specification | Yes | The experiments were implemented in Python 3.9 and executed on a MacBook Pro with 32 GB RAM and a 2.9 GHz 6-Core Intel Core i9 processor.
Software Dependencies | No | The only software information given is the Python version ("The experiments were implemented in Python 3.9..."); no libraries, frameworks, or dependency versions are listed.
Experiment Setup | Yes | We choose λ = 0.005‖Qᵀb‖∞. For APDA we perform a parameter sweep over β ∈ [1e-3, 1e6] for each dataset and settle for: β = 2.68e3 for ijcnn; β = 5.18e4 for a9a; β = 3.16e1 for mushrooms; β = 3.73e-1 for covtype. For CVA we sweep p ∈ [1e-3, 1e6] and set τ = 1/(‖A‖/p + L) and σ = 1/(p‖A‖)... (The quoted step-size rules are sketched after the table.)
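
To make the transcribed pseudocode concrete, here is a minimal NumPy sketch of the APDA loop for min_x f(x) + g(Ax). The helper prox_g_conj (the proximal operator of the conjugate g*) and the local-smoothness estimate L_k = ‖∇f(x^k) − ∇f(x^{k−1})‖ / ‖x^k − x^{k−1}‖ are assumptions filled in from standard adaptive-step-size practice rather than verbatim from the paper; the authors' reference implementation is in the repository linked above.

```python
import numpy as np

def apda(grad_f, prox_g_conj, A, x0, y0, tau_init=1e-3, beta=1.0, c=0.5, n_iters=1000):
    """Sketch of APDA for min_x f(x) + g(Ax), following the pseudocode above.

    grad_f      : gradient of the smooth term f
    prox_g_conj : (y, sigma) -> prox_{sigma * g^*}(y), prox of the conjugate of g
    A           : linear coupling operator as a dense NumPy array
    """
    norm_A = np.linalg.norm(A, 2)                        # spectral norm ||A||
    y = y0
    x_prev = x0
    x = x0 - tau_init * (grad_f(x0) + A.T @ y0)          # initial gradient step
    grad_prev = grad_f(x_prev)
    tau_prev, theta_prev = np.inf, 1.0                   # tau_0 = +inf, theta_0 = 1
    for _ in range(n_iters):
        grad_cur = grad_f(x)
        # Local smoothness estimate (assumed definition of L_k).
        L_k = np.linalg.norm(grad_cur - grad_prev) / (np.linalg.norm(x - x_prev) + 1e-16)
        tau = min(1.0 / (2.0 * np.sqrt(L_k**2 + (beta / (1.0 - c)) * norm_A**2)),
                  tau_prev * np.sqrt(1.0 + theta_prev))
        sigma = beta * tau
        theta = tau / tau_prev                           # equals 0.0 on the first pass
        x_bar = x + theta * (x - x_prev)                 # primal extrapolation
        y = prox_g_conj(y + sigma * (A @ x_bar), sigma)  # dual proximal step
        x_prev, x = x, x - tau * (grad_cur + A.T @ y)    # primal gradient step
        grad_prev = grad_cur
        tau_prev, theta_prev = tau, theta
    return x, y
```

For the sparse logistic regression experiments one would take, for instance, g = λ‖·‖₁, whose conjugate prox is simply clipping to the interval [−λ, λ].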
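
The quoted setup parameters can likewise be sketched. The ∞-norm reading of the λ rule and the logistic-loss smoothness bound L are assumptions, and Q, b are hypothetical stand-ins for a loaded LIBSVM feature matrix and ±1 label vector; the reconstructed CVA step sizes satisfy the Condat–Vũ condition 1/τ − σ‖A‖² = L with equality, which the assert verifies.

```python
import numpy as np

# Hypothetical stand-ins for a LIBSVM design matrix Q and +/-1 labels b.
rng = np.random.default_rng(0)
Q = rng.standard_normal((500, 100))
b = np.sign(rng.standard_normal(500))

# Regularization weight; the infinity-norm in the lambda rule is an assumption.
lam = 0.005 * np.linalg.norm(Q.T @ b, np.inf)

# Log-spaced sweep grid over [1e-3, 1e6], as reported for beta (APDA) and p (CVA).
grid = np.logspace(-3, 6, num=19)

# CVA step sizes: tau = 1/(||A||/p + L), sigma = 1/(p * ||A||).
norm_A = 1.0                                       # placeholder; depends on the splitting used
L = np.linalg.norm(Q, 2) ** 2 / (4 * Q.shape[0])   # logistic-loss smoothness bound (assumption)
for p in grid:
    tau = 1.0 / (norm_A / p + L)
    sigma = 1.0 / (p * norm_A)
    # 1/tau - sigma*||A||^2 = ||A||/p + L - ||A||/p = L, the Condat-Vu requirement.
    assert np.isclose(1.0 / tau - sigma * norm_A**2, L)
```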