Matrix Completion with Model-free Weighting
Authors: Jiayi Wang, Raymond K. W. Wong, Xiaojun Mao, Kwun Chuen Gary Chan
ICML 2021
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | "Numerical experiments are also provided to demonstrate the effectiveness of the proposed method." (Abstract); see also Sections 6 (Simulations) and 7 (Real Data Applications). |
| Researcher Affiliation | Academia | (1) Department of Statistics, Texas A&M University, College Station, TX 77843, USA; (2) School of Data Science, Fudan University, Shanghai, 200433, China; (3) Department of Biostatistics, University of Washington, Seattle, WA 98195, USA. |
| Pseudocode | No | The paper refers to extended algorithms and details in the supplemental document (e.g., Section E.1 and E.2), but no pseudocode or algorithm blocks are present in the main text. |
| Open Source Code | No | The paper does not provide an explicit statement or link indicating that the source code for the proposed method is openly available. |
| Open Datasets | Yes | Coat Shopping Dataset, available at http://www.cs.cornell.edu/~schnabts/mnar/, and the Yahoo! Webscope Dataset, available at http://research.yahoo.com/AcademicRelations. |
| Dataset Splits | Yes | "For all methods mentioned above, we randomly separate 20% of the observed entries in every simulated dataset and use it as the validation set to select tuning parameters." (Section 6) "For both datasets, we separate half of the test data set as the validation set to select tuning parameters for all methods." (Section 7) A minimal sketch of such a split is given after the table. |
| Hardware Specification | No | The acknowledgement "Portions of this research were conducted with high performance research computing resources provided by Texas A&M University (https://hprc.tamu.edu)" is too general and does not provide specific hardware details such as GPU/CPU models or memory. |
| Software Dependencies | No | The paper mentions various algorithms and methods (e.g., the "L-BFGS-B algorithm", "Soft Impute") but does not specify software packages or version numbers used for the implementation or experimental setup. |
| Experiment Setup | No | The paper describes dataset generation, noise settings, and missing mechanisms, and mentions tuning parameter selection via a validation set, but it does not provide concrete hyperparameter values (e.g., learning rate, batch size, epochs) or detailed optimizer settings for the models. |
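The validation split quoted in the Dataset Splits row (holding out 20% of the observed entries in each simulated dataset) is straightforward to reproduce in principle. The sketch below is not the authors' code; it assumes the observed entries are encoded as a binary mask over the data matrix, and the function and variable names are illustrative.

```python
import numpy as np

def split_observed_entries(mask, val_frac=0.2, seed=0):
    """Randomly hold out a fraction of the observed entries as a validation set.

    mask: binary (n1 x n2) array, 1 where an entry is observed.
    Returns (train_mask, val_mask): disjoint binary masks whose union equals `mask`.
    """
    rng = np.random.default_rng(seed)
    obs_idx = np.flatnonzero(mask)                      # flat indices of observed entries
    n_val = int(round(val_frac * obs_idx.size))         # e.g. 20% of observed entries
    val_idx = rng.choice(obs_idx, size=n_val, replace=False)

    val_mask = np.zeros_like(mask)
    val_mask.flat[val_idx] = 1
    train_mask = mask - val_mask                        # remaining entries used for fitting
    return train_mask, val_mask

if __name__ == "__main__":
    # Toy example: a 100 x 80 matrix with roughly 30% of entries observed.
    mask = (np.random.rand(100, 80) < 0.3).astype(int)
    train_mask, val_mask = split_observed_entries(mask)
    print(train_mask.sum(), val_mask.sum())
```

In the paper's setup, tuning parameters would then be selected by evaluating candidate fits on the held-out validation entries; the specific error criterion and tuning grids are not stated in the main text.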