Layered Sampling for Robust Optimization Problems

Authors: Hu Ding, Zixiu Wang

ICML 2020 | Conference PDF | Archive PDF

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "In this paper, we propose a new variant of coreset technique, layered sampling, to deal with two fundamental robust optimization problems: k-median/means clustering with outliers and linear regression with outliers. ... We observe that these problems can be often efficiently solved by some heuristic algorithms in practice ... we compare these two methods in our experiments. ... Due to the space limit, we leave the complete experimental results to our supplement."
Researcher Affiliation | Academia | "Hu Ding¹, Zixiu Wang¹. ¹School of Computer Science and Technology, University of Science and Technology of China. Correspondence to: Hu Ding <huding@ustc.edu.cn, http://staff.ustc.edu.cn/~huding/>."
Pseudocode | Yes | "Algorithm 1 LAYERED SAMPLING FOR k-MED-OUTLIER" ... "Algorithm 2 LAYERED SAMPLING FOR LIN1-OUTLIER" (illustrative Python sketches of both algorithms follow this table)
Open Source Code | No | The paper provides no explicit statement of, or link to, open-source code for the described methodology.
Open Datasets | No | The paper refers only to generic problem settings ("k-median/means clustering with outliers", "linear regression with outliers") and to "an instance P", and does not name or give access information for any specific dataset used for training or evaluation.
Dataset Splits | No | The paper does not provide specific details on dataset splits (e.g., training/validation/test percentages or counts) or reference predefined splits for reproducibility.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running its experiments.
Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers (e.g., libraries, frameworks, or solvers).
Experiment Setup | No | The paper focuses on the theoretical aspects and algorithms, but does not provide specific experimental setup details such as hyperparameters, optimization settings, or training configurations.
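Because Algorithm 1 is available only as pseudocode and no open-source implementation is released, the following is a minimal Python sketch of the layered-sampling idea for k-median with outliers. It is an illustration under stated assumptions, not the authors' exact procedure: the geometric layer radii, the per-layer sample budget, the weighting rule, and the names layered_sampling_kmedian and per_layer are hypothetical choices for this sketch; the paper derives its actual layer structure and sample sizes from the approximation analysis.

```python
import numpy as np

def layered_sampling_kmedian(P, centers, num_layers=10, per_layer=50, seed=0):
    """Sketch of layered sampling around an approximate k-median solution.

    P       : (n, d) array of data points
    centers : (k, d) array, an approximate (constant-factor) solution
    Returns a weighted coreset (points, weights).
    """
    rng = np.random.default_rng(seed)
    # Distance from every point to its nearest given center.
    dists = np.min(np.linalg.norm(P[:, None, :] - centers[None, :, :], axis=2),
                   axis=1)
    base = max(dists.mean(), 1e-12)  # assumed base radius; layers grow as 2^j
    pts, wts = [], []
    lo = 0.0
    for j in range(num_layers):
        hi = (2.0 ** j) * base
        layer = np.flatnonzero((dists >= lo) & (dists < hi))
        if layer.size:
            m = min(per_layer, layer.size)
            pick = rng.choice(layer, size=m, replace=False)
            pts.append(P[pick])
            # Each sample stands in for layer.size / m original points.
            wts.append(np.full(m, layer.size / m))
        lo = hi
    # Points beyond the outermost layer are kept verbatim: they are the
    # natural outlier candidates and too few and too far to sample safely.
    far = np.flatnonzero(dists >= lo)
    if far.size:
        pts.append(P[far])
        wts.append(np.ones(far.size))
    return np.vstack(pts), np.concatenate(wts)
```

Any heuristic solver for k-median/means with outliers can then be run on the much smaller weighted coreset in place of the full point set.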
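A companion sketch for the LIN1-OUTLIER setting of Algorithm 2 applies the same layering to residual magnitudes under an initial regression fit. As above, the layer scale, the sample budget, and the name layered_sampling_regression are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def layered_sampling_regression(X, y, beta0, num_layers=10, per_layer=50,
                                seed=0):
    """Sketch of layered sampling for linear regression with outliers.

    X     : (n, d) design matrix
    y     : (n,)   responses
    beta0 : (d,)   an initial, possibly crude, regression solution
    Layers are defined by residual magnitude under beta0, mirroring the
    distance layers used in the clustering sketch.
    """
    rng = np.random.default_rng(seed)
    res = np.abs(y - X @ beta0)      # residual of each point under beta0
    base = max(res.mean(), 1e-12)
    Xs, ys, wts = [], [], []
    lo = 0.0
    for j in range(num_layers):
        hi = (2.0 ** j) * base
        layer = np.flatnonzero((res >= lo) & (res < hi))
        if layer.size:
            m = min(per_layer, layer.size)
            pick = rng.choice(layer, size=m, replace=False)
            Xs.append(X[pick])
            ys.append(y[pick])
            wts.append(np.full(m, layer.size / m))
        lo = hi
    # High-residual points are retained outright as outlier candidates.
    far = np.flatnonzero(res >= lo)
    if far.size:
        Xs.append(X[far])
        ys.append(y[far])
        wts.append(np.ones(far.size))
    return np.vstack(Xs), np.concatenate(ys), np.concatenate(wts)
```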