Heuristic Search for Approximating One Matrix in Terms of Another Matrix
Authors: Guihong Wan, Haim Schweitzer
IJCAI 2021 | Conference PDF | Archive PDF | Plain Text | LLM Run Details
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experimental results show that our sub-optimal algorithms compare favorably with the current state-of-the-art greedy algorithms. |
| Researcher Affiliation | Academia | Guihong Wan and Haim Schweitzer, Department of Computer Science, The University of Texas at Dallas. Guihong.Wan@utdallas.edu, HSchweitzer@utdallas.edu |
| Pseudocode | Yes | Algorithm 1 The Search Algorithm. |
| Open Source Code | No | The paper states 'Other implementations are publicly available,' referring to existing algorithms, but does not provide a statement or link for the authors' own source code. |
| Open Datasets | Yes | We describe experiments on various datasets that are publicly available. For the N=1 case we compare the proposed algorithm with the following methods: Leaps [Furnival and Wilson, 1974]; Forward [Hastie et al., 2009]; Backward [Hastie et al., 2009]; OMP [Mallat, 1999]; FoBa [Zhang, 2009]; POSS [Qian et al., 2015]. For the general case (N > 1) we compare our algorithm with the following algorithms: SOMP [Tropp et al., 2006]; SSBR [Belmerhnia et al., 2014]; SOLS [Chen and Huo, 2006]; CM [Civril and Magdon-Ismail, 2012]; ISOLS [Maung and Schweitzer, 2015]. The results for SOLS, CM, and ISOLS are the same (they differ only in runtime); the results for ISOLS are shown. The implementations used for Leaps, Forward, and Backward are the functions in the R library leaps. SOMP and SSBR are implemented in Python. Other implementations are publicly available. Experiments are conducted on an iMac with an Intel Quad-Core i7 processor and 32GB of memory. |
| Dataset Splits | No | The paper states 'The first three datasets are split evenly and experimented without intercept,' but it does not report sample counts or mention a validation set for the dataset splits. |
| Hardware Specification | Yes | Experiments are conducted on an iMac with an Intel Quad-Core i7 processor and 32GB of memory. |
| Software Dependencies | No | The paper mentions software like 'R library leaps' and 'Python' but does not provide specific version numbers for any software, libraries, or programming languages. |
| Experiment Setup | No | The paper mentions 'The first three datasets are split evenly and experimented without intercept' and 'The remaining datasets are real multi-target datasets tested with intercept,' which are high-level setup details. However, it does not provide specific hyperparameter values (e.g., learning rate, batch size) or detailed optimizer settings necessary for full reproducibility. |
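
The paper does not release its own implementation, but the N=1 comparison described in the Open Datasets row can be illustrated with a small, hypothetical baseline: Orthogonal Matching Pursuit (OMP) on an even train/test split without an intercept. The sketch below is not the authors' code; it uses scikit-learn and a synthetic dataset as stand-ins, and the sparsity level k, dataset, and random seeds are assumptions, since the paper does not report its exact settings.

```python
# Minimal, assumed sketch of an OMP baseline for the N=1 case (not the authors' code).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import OrthogonalMatchingPursuit
from sklearn.model_selection import train_test_split

# Synthetic stand-in for one of the publicly available regression datasets.
X, y = make_regression(n_samples=200, n_features=50, n_informative=8,
                       noise=0.1, random_state=0)

# "Split evenly": a 50/50 train/test split (no validation set is described).
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

# N=1 baseline: OMP selects k columns of X to approximate y, fitted without an
# intercept, matching the note that the first three datasets are run without intercept.
k = 8  # assumed sparsity level; the paper's exact value is not given
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(X_train, y_train)

selected = np.flatnonzero(omp.coef_)  # indices of the selected columns
test_error = np.linalg.norm(y_test - omp.predict(X_test))
print(f"selected columns: {selected}")
print(f"test residual norm: {test_error:.3f}")
```

For the general N > 1 case, the single target y would be replaced by a matrix of targets and the selection step by a simultaneous method such as SOMP, for which the paper notes a Python implementation was used.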