Understanding VAEs in Fisher-Shannon Plane
Authors: Huangjie Zheng, Jiangchao Yao, Ya Zhang, Ivor W. Tsang, Jia Wang (pp. 5917-5924)
AAAI 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through extensive qualitative and quantitative experiments, we provide a better comprehension of VAEs in tasks such as high-resolution reconstruction and representation learning, from the perspective of Fisher information and Shannon information. |
| Researcher Affiliation | Academia | Huangjie Zheng (1), Jiangchao Yao (1,2), Ya Zhang (1), Ivor W. Tsang (2), Jia Wang (1) — (1) Cooperative Medianet Innovation Center, Shanghai Jiao Tong University; (2) University of Technology Sydney |
| Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide a statement or link indicating the open-sourcing of the code for the methodology described. |
| Open Datasets | Yes | The experiments are conducted on the MNIST dataset (Lecun et al. 1998) and the SVHN dataset (Netzer et al. 2011). |
| Dataset Splits | Yes | We follow the original partition to split the data as 50,000/10,000/10,000 for the training, validation and test. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU/GPU models, memory) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific software dependencies with version numbers. |
| Experiment Setup | Yes | For the architecture of both the inference network and the generative network, we deploy a 5-layer network. Since the impacts of fully-connected and convolutional architectures do not differ much in the experiments, we present results using 5 fully-connected layers of dimension 300. The latent code is of dimension 40. |
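
The reported setup (5 fully-connected layers of width 300 for both the inference and generative networks, a 40-dimensional latent code) can be sketched as a minimal VAE forward pass. This is a hedged illustration, not the authors' code: the MNIST input size of 784, the ReLU activations, the reading of "5 layers" as five hidden layers, and all function names are assumptions for illustration.

```python
# Hypothetical sketch of the paper's reported architecture: 5 fully-connected
# hidden layers of width 300 for encoder and decoder, latent dimension 40.
# Input size 784 (flattened 28x28 MNIST) and ReLU activations are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(sizes):
    """Return (weight, bias) pairs for a fully-connected stack."""
    return [(rng.standard_normal((m, n)) * 0.01, np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:  # ReLU on hidden layers, linear output
            x = np.maximum(x, 0.0)
    return x

input_dim, hidden, latent = 784, [300] * 5, 40

# Encoder outputs 2 * latent values: mean and log-variance of q(z|x).
encoder = make_mlp([input_dim] + hidden + [2 * latent])
decoder = make_mlp([latent] + hidden + [input_dim])

x = rng.standard_normal((8, input_dim))          # a toy batch of 8 inputs
stats = forward(encoder, x)
mu, logvar = stats[:, :latent], stats[:, latent:]
# Reparameterization trick: z = mu + sigma * eps
z = mu + np.exp(0.5 * logvar) * rng.standard_normal(mu.shape)
x_hat = forward(decoder, z)
print(x_hat.shape)  # (8, 784)
```

The parameter count of this sketch (roughly 0.7M per network) is small enough that the fully-connected and convolutional variants mentioned in the paper would plausibly behave similarly on MNIST-scale data.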