Notice: The reproducibility variables underlying each score are classified using an automated LLM-based pipeline, validated against a manually labeled dataset. LLM-based classification introduces uncertainty and potential bias; scores should be interpreted as estimates. Full accuracy metrics and methodology are described in [1].

Robust Noise Attenuation via Adaptive Pooling of Transformer Outputs

Author: Greyson Brothers

ICML 2025 | Venue PDF | LLM Run Details

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | Our theoretical results are first validated by supervised experiments on a synthetic dataset designed to isolate the SNR problem, then generalized to standard relational reasoning, multi-agent reinforcement learning, and vision benchmarks with noisy observations, where transformers with adaptive pooling display superior robustness across tasks.
Researcher Affiliation | Academia | 1Johns Hopkins University Applied Physics Laboratory, Maryland, USA. Correspondence to: Greyson Brothers <EMAIL>.
Pseudocode | No | The paper describes methods, theorems, and proofs using mathematical notation and textual explanations, but it does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | Yes | Code is publicly available at https://github.com/agbrothers/pooling.
Open Datasets | Yes | We use the standard Multi-Particle Environment (MPE) benchmark. ... Box World is a vision-based relational reasoning task introduced by Zambaldi et al. (2019). ... We conducted additional studies on image classification using the CIFAR 10 and 100 benchmark datasets.
Dataset Splits | Yes | For each of these pairs, we use 5-fold cross-validation with a holdout test set of 100k samples. ... All experiments trained vision transformers (ViT) from scratch on the CIFAR dataset using 5-fold cross-validation.
Hardware Specification | No | The paper does not provide specific hardware details such as exact GPU/CPU models, processor types, or memory amounts used for running the experiments.
Software Dependencies | Yes | Our dataset was generated using NumPy version 2.0.2 with a seed of 42.
Experiment Setup | Yes | Additional hyperparameters are listed in the appendix C.2. ... Additional training hyperparameters can be found in C.4. ... Training hyperparameters and network architecture details are outlined in Table 16.
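The Dataset Splits row quotes a protocol of 5-fold cross-validation with a fixed 100k-sample holdout test set. The paper's actual split code is not reproduced in this report; the following is a minimal sketch of one way such a split could be constructed, assuming a simple shuffled index partition (the function name `holdout_and_folds` and all sizes other than the 100k holdout are illustrative):

```python
import numpy as np

def holdout_and_folds(n_samples, holdout_size, n_folds=5, seed=42):
    """Partition sample indices into a fixed holdout test set plus k CV folds.

    Illustrative sketch only; this is not the paper's implementation.
    """
    rng = np.random.default_rng(seed)
    indices = rng.permutation(n_samples)
    holdout = indices[:holdout_size]            # fixed held-out test set
    remainder = indices[holdout_size:]
    folds = np.array_split(remainder, n_folds)  # k roughly equal CV folds
    return holdout, folds

# Hypothetical example: from 500k samples, reserve a 100k holdout,
# leaving 5 cross-validation folds of 80k samples each.
holdout, folds = holdout_and_folds(500_000, 100_000)
```

Each fold then serves once as the validation set while the others are used for training, with the holdout touched only for final evaluation.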
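The Software Dependencies row notes that the dataset was generated with NumPy 2.0.2 and a seed of 42. The exact generation code is not shown in the report; the snippet below only illustrates the general pattern of seeded, reproducible generation with NumPy's `Generator` API (the normal draw is a placeholder, not the paper's data distribution):

```python
import numpy as np

# Seeded generator: the same seed yields the same stream of draws.
rng = np.random.default_rng(seed=42)
sample_a = rng.standard_normal(4)

# Re-creating the generator with the same seed reproduces the draw exactly.
rng2 = np.random.default_rng(seed=42)
sample_b = rng2.standard_normal(4)
assert np.array_equal(sample_a, sample_b)
```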