Provably Correct Automatic Sub-Differentiation for Qualified Programs

Authors: Sham M. Kakade, Jason D. Lee

NeurIPS 2018

Reproducibility Variable | Result | LLM Response
Research Type | Theoretical | Our main result shows that, under certain restrictions on our library of nonsmooth functions (standard in nonlinear programming), provably correct generalized subderivatives can be computed at a computational cost that is within a (dimension-free) factor of 6 of the cost of computing the scalar function itself.
Researcher Affiliation | Academia | Sham M. Kakade, University of Washington (sham@cs.washington.edu); Jason D. Lee, University of Southern California (jasonlee@marshall.usc.edu)
Pseudocode | Yes | Algorithm 1: Straight Line Program for f(x); Algorithm 2: The Reverse Mode of AD; Algorithm 3: Program for a Nonsmooth Function g(x); Algorithm 4: ReLU(x); Algorithm 5: ReLU(x); Algorithm 6: Automatic Subdifferentiation; Algorithm 7: Overloading the Function g(x); Algorithm 8: σ(x)
Open Source Code | No | The paper does not provide a link to or explicitly state the release of its own source code. It mentions third-party libraries like TensorFlow and PyTorch.
Open Datasets | No | This is a theoretical paper and does not involve the use of datasets for training.
Dataset Splits | No | This is a theoretical paper and does not involve dataset splits for validation.
Hardware Specification | No | The paper is theoretical and discusses computational costs in terms of an abstract 'unit runtime cost' rather than specific hardware used for experiments.
Software Dependencies | No | The paper mentions existing software (TensorFlow, PyTorch) as background examples but does not specify software dependencies with version numbers for its own contributions or implementation.
Experiment Setup | No | This is a theoretical paper and does not describe any experimental setup details such as hyperparameters or training configurations.
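The Pseudocode row above lists two distinct programs for ReLU(x) (Algorithms 4 and 5), which reflects the paper's motivating point: naive automatic differentiation can report different "derivatives" at nonsmooth points depending on how the same function is written. The following is a minimal forward-mode sketch of that phenomenon, not a reproduction of the paper's algorithms; the Dual class, the ReLU'(0) = 0 convention, and the two identity-function programs are illustrative choices.

```python
from dataclasses import dataclass


@dataclass
class Dual:
    """A value together with a directional derivative (forward-mode AD)."""
    val: float
    dot: float

    def __neg__(self):
        return Dual(-self.val, -self.dot)

    def __sub__(self, other):
        return Dual(self.val - other.val, self.dot - other.dot)


def relu(x):
    # A common library convention at the kink: ReLU'(0) = 0, which is a valid
    # element of the subdifferential [0, 1] of ReLU at 0.
    return Dual(x.val, x.dot) if x.val > 0.0 else Dual(0.0, 0.0)


def f_direct(x):
    return x                   # program A for the identity function


def f_via_relu(x):
    return relu(x) - relu(-x)  # program B, the same function: x = ReLU(x) - ReLU(-x)


x0 = Dual(0.0, 1.0)            # evaluate at the kink x = 0 with seed dx = 1
print(f_direct(x0).dot)        # 1.0: the true derivative of the identity
print(f_via_relu(x0).dot)      # 0.0: not a valid subgradient of the identity at 0
```

Program B reports 0 at x = 0 even though the function it computes is the identity, whose only (sub)gradient there is 1; the paper's qualification conditions on the nonsmooth library are aimed at ruling out exactly this kind of failure.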
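The factor-of-6 statement quoted in the Research Type row is a nonsmooth analogue of the classical cheap-gradient principle behind reverse-mode AD on straight-line programs (the setting of the paper's Algorithms 1 and 2). As a reminder of that baseline, here is a small hand-unrolled reverse-mode sketch for a smooth straight-line program; the function f and all variable names are illustrative and not taken from the paper.

```python
import math


def f_and_grad(x):
    """f(x1, x2) = sin(x1 * x2) + x1, with its gradient by reverse-mode AD."""
    x1, x2 = x

    # Forward sweep: evaluate the straight-line program, recording intermediates.
    z1 = x1 * x2
    z2 = math.sin(z1)
    f = z2 + x1

    # Backward sweep: one adjoint update per recorded operation.
    df_dz2 = 1.0
    df_dx1 = 1.0                    # contribution of the "+ x1" term
    df_dz1 = df_dz2 * math.cos(z1)  # d sin(z1) / d z1
    df_dx1 += df_dz1 * x2           # d (x1 * x2) / d x1
    df_dx2 = df_dz1 * x1            # d (x1 * x2) / d x2
    return f, (df_dx1, df_dx2)


print(f_and_grad((0.5, 2.0)))  # f = sin(1.0) + 0.5, grad = (1 + 2*cos(1), 0.5*cos(1))
```

Because the backward sweep does a constant amount of work per recorded operation, the gradient costs only a dimension-free constant factor more than evaluating f itself; the paper's contribution is a guarantee of the same flavor, a factor of 6, for provably correct generalized subderivatives of qualified nonsmooth programs.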