Contrastive Learning from Pairwise Measurements
Authors: Yi Chen, Zhuoran Yang, Yuchen Xie, Zhaoran Wang
NeurIPS 2018
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We provide numerical experiments to corroborate our theory. We lay out the simulation results in this section and demonstrate the accuracy of the contrastive estimator Θ̂ stated in Theorem 4.4. In Figure 1(a)-(b), we plot the rescaled estimation error (1/d)‖Θ̂ − Θ*‖_F against 1/√n, where n is the sample size. |
| Researcher Affiliation | Academia | Northwestern University; Princeton University |
| Pseudocode | No | The paper describes the theoretical framework and estimator but does not include any structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any explicit statements or links indicating the availability of open-source code for the described methodology. |
| Open Datasets | No | The paper describes generating its own synthetic data for numerical experiments ('we first generate two matrices U ∈ ℝ^(d×r) and V ∈ ℝ^(r×d), whose entries are independently and identically distributed standard normal random variables.') and does not provide concrete access information for a publicly available dataset. (A minimal sketch of this synthetic setup appears after the table.) |
| Dataset Splits | No | The paper describes generating synthetic data for simulations but does not specify any training, validation, or test dataset splits. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper mentions using the 'proximal gradient method' but does not specify any software dependencies with version numbers (e.g., Python version, library names and versions). (A generic proximal gradient sketch appears after the table.) |
| Experiment Setup | Yes | For the regularization parameter λ, we set λ in (3.2) as 0.5·√(log(2d)/(nd)), as suggested in Theorem 4.4. (See the first sketch after the table.) |
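
The synthetic-data generation quoted under Open Datasets and the regularization choice quoted under Experiment Setup can be condensed into a minimal sketch. Only the i.i.d. Gaussian construction of U and V and the (reconstructed) value λ = 0.5·√(log(2d)/(nd)) are taken from the table; the product Θ* = UV, the dimensions d and r, the sample size n, and the variable names are illustrative assumptions, and the paper's pairwise measurement model itself is not reproduced here.

```python
import numpy as np

# Hypothetical problem sizes; the paper's exact values are not listed in the table.
d, r, n = 100, 5, 10_000

rng = np.random.default_rng(0)

# Quoted setup: U in R^{d x r} and V in R^{r x d} with i.i.d. standard normal
# entries. Forming the low-rank target as their product is an assumption.
U = rng.standard_normal((d, r))
V = rng.standard_normal((r, d))
Theta_star = U @ V  # assumed rank-r ground truth, Theta* = U V

# Regularization parameter as read from the quoted setting for Theorem 4.4:
# lambda = 0.5 * sqrt(log(2d) / (n d)).
lam = 0.5 * np.sqrt(np.log(2 * d) / (n * d))
print(f"lambda = {lam:.3e}")
```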
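The table also notes that the estimator in (3.2) is computed with the proximal gradient method and that Figure 1 reports the rescaled error (1/d)‖Θ̂ − Θ*‖_F. The following is a generic proximal gradient skeleton for a nuclear-norm-regularized objective, not the authors' implementation: the smooth contrastive loss is left as a user-supplied gradient callback, and the step size and iteration count are placeholder assumptions.

```python
import numpy as np

def prox_nuclear(Z, tau):
    """Proximal operator of tau * nuclear norm: soft-threshold the singular values."""
    Us, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Us @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def proximal_gradient(grad_loss, d, lam, step=0.1, n_iters=500):
    """Generic solver for: min_Theta loss(Theta) + lam * ||Theta||_*.

    grad_loss(Theta) must return the gradient of the smooth loss; the paper's
    contrastive loss is not reproduced here, so this is only a solver skeleton.
    """
    Theta = np.zeros((d, d))
    for _ in range(n_iters):
        Theta = prox_nuclear(Theta - step * grad_loss(Theta), step * lam)
    return Theta

def rescaled_error(Theta_hat, Theta_star):
    """Rescaled estimation error reported in Figure 1: (1/d) * ||Theta_hat - Theta*||_F."""
    d = Theta_star.shape[0]
    return np.linalg.norm(Theta_hat - Theta_star, "fro") / d
```

With the quantities from the previous sketch, one would pass a gradient implementation of the paper's loss as `grad_loss`, call `proximal_gradient(grad_loss, d, lam)`, and evaluate `rescaled_error(Theta_hat, Theta_star)` to reproduce the plotted quantity.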