Expressivity of ReLU-Networks under Convex Relaxations
Authors: Maximilian Baader, Mark Niklas Mueller, Yuhao Mao, Martin Vechev
ICLR 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | The paper is purely theoretical; e.g., it states, "In this work, we prove the following key results: ..." and proceeds with formal proofs rather than experiments. |
| Researcher Affiliation | Academia | Department of Computer Science, ETH Zurich, Switzerland; {mbaader,mark.mueller,yuhao.mao,martin.vechev}@inf.ethz.ch |
| Pseudocode | No | The paper contains mathematical derivations, lemmas, and theorems, but no explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not contain any explicit statements or links regarding the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical, focusing on mathematical proofs about expressivity. It conducts no experiments on datasets and therefore provides no information about public dataset availability. |
| Dataset Splits | No | As a theoretical paper, it does not involve data splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any experiments that would require specific hardware for execution. |
| Software Dependencies | No | The paper focuses on theoretical concepts and mathematical proofs and does not specify software dependencies with version numbers. |
| Experiment Setup | No | As a theoretical paper, it does not include details on experimental setup, hyperparameters, or training configurations. |
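
As the rows above note, the paper contains no code or pseudocode. For orientation only, the following is a minimal, hypothetical sketch of interval bound propagation (IBP), the coarsest of the convex relaxations the paper studies, applied to a two-layer ReLU network. All weights, biases, and input bounds here are invented for illustration and do not come from the paper.

```python
# Illustrative sketch only: the paper itself ships no code. This shows
# interval bound propagation (IBP) through a tiny ReLU network with
# hypothetical weights, to illustrate the kind of convex relaxation
# whose expressivity the paper analyzes.
import numpy as np

def ibp_affine(lo, up, W, b):
    """Propagate the input box [lo, up] through x -> W @ x + b."""
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    # Lower bound uses lo where W is positive and up where W is negative.
    return W_pos @ lo + W_neg @ up + b, W_pos @ up + W_neg @ lo + b

def ibp_relu(lo, up):
    """ReLU is monotone, so the box maps to [max(lo, 0), max(up, 0)]."""
    return np.maximum(lo, 0.0), np.maximum(up, 0.0)

# Two-layer ReLU network with made-up parameters.
W1, b1 = np.array([[1.0, -1.0], [0.5, 2.0]]), np.array([0.0, -1.0])
W2, b2 = np.array([[1.0, 1.0]]), np.array([0.5])

lo, up = np.array([-0.1, -0.1]), np.array([0.1, 0.1])  # input box
lo, up = ibp_relu(*ibp_affine(lo, up, W1, b1))
lo, up = ibp_affine(lo, up, W2, b2)
print(f"output bounds: [{lo[0]:.3f}, {up[0]:.3f}]")
```

The sketch propagates a box (elementwise interval) layer by layer; the paper's results concern which functions such relaxations can certify exactly, not this particular implementation.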