Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].
Towards Lower Bounds on the Depth of ReLU Neural Networks
Authors: Christoph Hertrich, Amitabh Basu, Marco Di Summa, Martin Skutella
NeurIPS 2021 | Venue PDF | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | "Using techniques from mixed-integer optimization, polyhedral theory, and tropical geometry, we provide a mathematical counterbalance to the universal approximation theorems..." and "In Section 2, we resolve Conjecture 1.2 for k = 2, under a natural assumption on the breakpoints of the function represented by any intermediate neuron. We achieve this result by leveraging techniques from mixed-integer programming to analyze the set of functions computable by certain NNs. This provides a computational proof of Theorem 2.5." |
| Researcher Affiliation | Academia | Christoph Hertrich, Technische Universität Berlin, Berlin, Germany; Amitabh Basu, Johns Hopkins University, Baltimore, USA; Marco Di Summa, Università degli Studi di Padova, Padua, Italy; Martin Skutella, Technische Universität Berlin, Berlin, Germany |
| Pseudocode | No | No structured pseudocode or algorithm blocks were found in the paper. |
| Open Source Code | No | The paper does not provide an explicit statement or link for the open-source code for the methodology described. |
| Open Datasets | No | This is a theoretical paper that does not involve training on any datasets. |
| Dataset Splits | No | This is a theoretical paper and does not involve dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper mentions a 'computer-aided proof' and solving a Mixed-Integer Program (MIP) but does not provide specific hardware details (e.g., CPU, GPU models, or cloud computing specifications) used for these computations. |
| Software Dependencies | Yes | "Gurobi Optimization, LLC. Gurobi Optimizer Reference Manual, 2021." and "The Sage Developers. SageMath, the Sage Mathematics Software System (Version 9.0), 2020." |
| Experiment Setup | No | This is a theoretical paper that performs a computer-aided proof involving a Mixed-Integer Program (MIP), but it does not describe specific experimental setup details such as hyperparameters or training configurations typically found in empirical machine learning studies. |
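The Hardware Specification and Software Dependencies rows note that the paper's computer-aided proof of Theorem 2.5 reduces to solving a mixed-integer program (MIP) with Gurobi. As a purely hypothetical illustration of what a MIP formulation looks like in code (this is a toy model, not the paper's actual program, and SciPy's `milp` is used as a freely available stand-in for the licensed Gurobi solver):

```python
# Toy MIP sketch: maximize x0 + 2*x1 subject to x0 + x1 <= 4,
# with x0, x1 integer in [0, 3]. SciPy's milp minimizes, so we
# negate the objective. This is an illustrative stand-in, not
# the MIP from the paper's computer-aided proof.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

c = np.array([-1.0, -2.0])                           # minimize -x0 - 2*x1
constraints = LinearConstraint(np.array([[1.0, 1.0]]), ub=4.0)
integrality = np.ones(2)                             # both variables integer
bounds = Bounds(lb=0, ub=3)

res = milp(c, constraints=constraints,
           integrality=integrality, bounds=bounds)
print(res.x, -res.fun)                               # optimal point, objective
```

A real computer-aided proof of this kind typically asserts infeasibility or a bound on the optimal value of a carefully constructed MIP, rather than simply reading off an optimum as above.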