Bit-Swap: Recursive Bits-Back Coding for Lossless Compression with Hierarchical Latent Variables
Authors: Friso Kingma, Pieter Abbeel, Jonathan Ho
ICML 2019
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Through experiments we verify that Bit-Swap results in lossless compression rates that are empirically superior to existing techniques. Our implementation is available at https://github.com/fhkingma/bitswap. |
| Researcher Affiliation | Academia | 1University of California, Berkeley, California, USA. |
| Pseudocode | Yes | Algorithm 1 BB-ANS for lossless compression with hierarchical latent variables. Algorithm 2 Bit-Swap (ours) for lossless compression with hierarchical latent variables. |
| Open Source Code | Yes | Our implementation is available at https://github.com/fhkingma/bitswap. |
| Open Datasets | Yes | To compare Bit-Swap against BB-ANS, we use the following image datasets: MNIST, CIFAR-10 and ImageNet (32×32). |
| Dataset Splits | No | The paper reports test-set compression results and ELBO values from trained models, but it does not provide specific train/validation/test split percentages or sample counts, nor does it cite predefined splits beyond the general dataset names. |
| Hardware Specification | No | The paper does not specify the exact hardware (e.g., GPU models, CPU types, memory) used for running the experiments. |
| Software Dependencies | No | The paper mentions implementing models in PyTorch but does not provide specific version numbers for PyTorch or any other software dependencies. |
| Experiment Setup | Yes | We train for 100 epochs with the Adam (Kingma & Ba, 2015) optimizer and a batch size of 64. |
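For context on the Pseudocode row: both BB-ANS and Bit-Swap build bits-back coding of latent variables on top of an asymmetric numeral system (ANS) entropy coder. The sketch below is a minimal pure-Python rANS coder over a toy static symbol distribution; it is an illustrative assumption for background only, not the authors' implementation (which is at the repository linked above).

```python
# Minimal rANS (range ANS) coder over a small static symbol distribution.
# BB-ANS and Bit-Swap push/pop latent variables through a coder like this;
# the LIFO (stack-like) behavior is what makes bits-back coding possible.

M = 8                                # total frequency count (power of two)
freqs = {"a": 4, "b": 3, "c": 1}     # symbol frequencies summing to M

# cumulative frequency (start of each symbol's slot range)
cum, acc = {}, 0
for sym in ("a", "b", "c"):
    cum[sym] = acc
    acc += freqs[sym]

def encode(symbols, state=1):
    """Push symbols onto the ANS state (reversed, so decode pops in order)."""
    for s in reversed(symbols):
        f = freqs[s]
        state = (state // f) * M + cum[s] + (state % f)
    return state

def decode(state, n):
    """Pop n symbols off the ANS state; returns (symbols, remaining state)."""
    out = []
    for _ in range(n):
        slot = state % M
        # the symbol whose cumulative range [cum, cum + freq) contains slot
        s = next(k for k in freqs if cum[k] <= slot < cum[k] + freqs[k])
        state = freqs[s] * (state // M) + slot - cum[s]
        out.append(s)
    return out, state

msg = ["a", "b", "a", "c"]
state = encode(msg)
decoded, final = decode(state, len(msg))
assert decoded == msg and final == 1   # exact round trip
```

More probable symbols (larger `freqs`) grow the state more slowly, so they cost fewer bits; a production coder additionally renormalizes the state by streaming out fixed-size words, which this sketch omits for clarity.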