IDEAL: Inexact DEcentralized Accelerated Augmented Lagrangian Method
Authors: Yossi Arjevani, Joan Bruna, Bugra Can, Mert Gurbuzbalaban, Stefanie Jegelka, Hongzhou Lin
NeurIPS 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We provide experimental results that demonstrate the effectiveness of the proposed algorithm on highly ill-conditioned problems. |
| Researcher Affiliation | Academia | Yossi Arjevani (NYU) yossia@nyu.edu; Joan Bruna (NYU) bruna@cims.nyu.edu; Bugra Can (Rutgers University) bc600@scarletmail.rutgers.edu; Mert Gürbüzbalaban (Rutgers University) mg1366@rutgers.edu; Stefanie Jegelka (MIT) stefje@csail.mit.edu; Hongzhou Lin (MIT) hongzhou@mit.edu |
| Pseudocode | Yes | Algorithm 1 Decentralized Augmented Lagrangian framework; Algorithm 2 Accelerated Decentralized Augmented Lagrangian framework; Algorithm 3 IDEAL: Inexact Acc-Decentralized Augmented Lagrangian framework |
| Open Source Code | No | The paper does not contain any explicit statements or links indicating that the source code for the proposed methodology is publicly available. |
| Open Datasets | Yes | To facilitate a simple comparison between existing state-of-the-art algorithms, we consider an ℓ2-regularized logistic regression task over two classes of the MNIST [24]/CIFAR-10 [23] benchmark datasets. ... [24] Y. Le Cun, C. Cortes, and C. Burges. Mnist handwritten digit database. ATT Labs [Online], 2, 2010. URL http://yann.lecun.com/exdb/mnist. |
| Dataset Splits | No | The paper mentions using MNIST and CIFAR-10 datasets but does not provide specific details on how these datasets were split into training, validation, or test sets (e.g., percentages, counts, or explicit references to standard splits). |
| Hardware Specification | No | The paper does not provide specific details about the hardware used to run the experiments, such as GPU/CPU models, processor types, or memory. |
| Software Dependencies | No | The paper mentions using logistic regression and convolutional kernel networks but does not provide specific software dependencies (e.g., programming languages, libraries, or frameworks) with version numbers. |
| Experiment Setup | Yes | We set the inner iteration counter to be Tk = 100 for all algorithms, and use the theoretical stepsize schedule. The decentralized environment is modelled in a synthetic setting, where the communication time is steady and no latency is encountered. To demonstrate the effect of the underlying network architecture, we consider: a) a circular graph, where the agents form a cycle; b) a Barbell graph, where the agents are split into two complete subgraphs, connected by a single bridge (shown in Figure 2 in the appendix). |
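
The Pseudocode row lists an augmented Lagrangian framework solved inexactly by an inner loop (Algorithms 1–3). The snippet below is a minimal generic sketch of such an inexact decentralized augmented Lagrangian loop, assuming a consensus formulation through a gossip matrix `W`; it is not the authors' exact IDEAL or accelerated updates, and the step size, penalty parameter, and function names are illustrative assumptions.

```python
# Minimal sketch of an inexact decentralized augmented Lagrangian loop, in the
# spirit of Algorithm 1 (NOT the authors' exact IDEAL/accelerated scheme).
import numpy as np

def inexact_decentralized_al(grad_fns, W, x0, rho=1.0, eta=0.01,
                             outer_iters=50, Tk=100):
    """grad_fns[i]: gradient oracle of agent i's local loss f_i.
    W: symmetric, doubly stochastic gossip matrix; consensus <=> (I - W) x = 0."""
    n, d = x0.shape
    L = np.eye(n) - W                       # graph-structured consensus operator
    x, lam = x0.copy(), np.zeros_like(x0)   # lam absorbs the dual variable via U = L^{1/2}
    for _ in range(outer_iters):
        for _ in range(Tk):                 # Tk inner gradient steps on the aug. Lagrangian
            local = np.stack([grad_fns[i](x[i]) for i in range(n)])
            x = x - eta * (local + lam + rho * (L @ x))   # only neighbor communication needed
        lam = lam + rho * (L @ x)           # dual ascent on the consensus multipliers
    return x.mean(axis=0)                   # average of the local iterates
```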
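
The Experiment Setup and Open Datasets rows specify the two communication topologies (a circular graph and a Barbell graph of two complete subgraphs joined by a single bridge) and an ℓ2-regularized logistic regression task over two classes. The sketch below builds those topologies with networkx and a binary logistic loss gradient; the gossip-matrix construction, the number of agents, and the regularization constant `mu` are assumptions, not values from the paper.

```python
# Hypothetical setup sketch: the cycle and Barbell communication graphs and an
# l2-regularized binary logistic regression gradient oracle.
import networkx as nx
import numpy as np

def gossip_matrix(G):
    """Simple lazy mixing matrix W = I - Lap / (d_max + 1) for graph G."""
    Lap = nx.laplacian_matrix(G).toarray().astype(float)
    d_max = max(dict(G.degree()).values())
    return np.eye(G.number_of_nodes()) - Lap / (d_max + 1)

n_agents = 20                                                  # assumed agent count
W_cycle = gossip_matrix(nx.cycle_graph(n_agents))              # agents form a cycle
W_barbell = gossip_matrix(nx.barbell_graph(n_agents // 2, 0))  # two cliques, one bridge

def logistic_grad(A, y, mu=1e-3):
    """Gradient of (1/m) sum_j log(1 + exp(-y_j a_j^T x)) + (mu/2) ||x||^2,
    with labels y in {-1, +1} (two classes of MNIST / CIFAR-10 in the paper)."""
    def grad(x):
        sig = 1.0 / (1.0 + np.exp(y * (A @ x)))   # sigma(-y a^T x)
        return A.T @ (-y * sig) / len(y) + mu * x
    return grad
```

Either mixing matrix, together with one `logistic_grad` oracle per agent, can be plugged directly into the `inexact_decentralized_al` sketch above.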