Learning Bound for Parameter Transfer Learning
Authors: Wataru Kumagai
NeurIPS 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Theoretical | Then, we introduce the notion of the local stability and parameter transfer learnability of parametric feature mapping, and thereby derive a learning bound for parameter transfer algorithms. ... In this paper, we also provide the first theoretical learning bound for self-taught learning. |
| Researcher Affiliation | Academia | Wataru Kumagai, Faculty of Engineering, Kanagawa University, kumagai@kanagawa-u.ac.jp |
| Pseudocode | No | The paper presents theoretical formulations, theorems, and proofs, but does not include any pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not mention providing open-source code for the described methodology. |
| Open Datasets | No | The paper is theoretical and focuses on deriving learning bounds; it does not describe experimental evaluation on a public dataset. |
| Dataset Splits | No | The paper is theoretical and does not describe experimental setups or dataset splits for training, validation, or testing. |
| Hardware Specification | No | The paper is theoretical and does not describe any computational experiments, thus no hardware specifications are mentioned. |
| Software Dependencies | No | The paper is theoretical and does not describe any computational experiments that would require specific software dependencies with version numbers. |
| Experiment Setup | No | The paper is theoretical and does not describe any experimental setup details such as hyperparameters or training configurations. |
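
The abstract quoted in the table refers to a high-probability learning bound for parameter transfer algorithms. As a rough orientation for readers of this report, the LaTeX sketch below renders the generic shape such two-stage (source-to-target) bounds usually take; it is an illustrative template only, not the theorem proved in the paper, and the symbols $n$, $\delta$, $\hat{\theta}$, $\theta^{*}$ and the $O(\cdot)$ terms are assumptions introduced here purely for exposition.

```latex
% Illustrative template only -- NOT the theorem of Kumagai (NeurIPS 2016).
% All symbols (n, delta, theta-hat, theta-star) are hypothetical and are
% used solely to show the typical shape of a two-stage transfer bound.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

With probability at least $1-\delta$ over the draw of the $n$ target
samples, a hypothesis $\hat{h}$ learned on top of a transferred
parametric feature mapping $\psi_{\hat{\theta}}$ typically satisfies a
bound of the form
\begin{equation*}
  R(\hat{h}) - \min_{h} R(h)
  \;\le\;
  \underbrace{O\!\Bigl(\sqrt{\tfrac{\log(1/\delta)}{n}}\Bigr)}_{\text{estimation error on the target task}}
  +
  \underbrace{O\!\bigl(\|\hat{\theta}-\theta^{*}\|\bigr)}_{\text{cost of the transferred parameter}},
\end{equation*}
where $\theta^{*}$ denotes the ideal feature parameter and
$\hat{\theta}$ the parameter estimated on the source task.

\end{document}
```

Roughly speaking, the first term is the usual estimation error that shrinks as more target samples become available, while local stability is the property that allows the second term to be controlled by how accurately the feature parameter was estimated on the source task.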