Generator Assisted Mixture of Experts for Feature Acquisition in Batch
Authors: Vedang Asgaonkar, Aditya Jain, Abir De
AAAI 2024
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Experiments with four datasets show that our approach outperforms these methods in terms of trade-off between accuracy and feature acquisition cost. |
| Researcher Affiliation | Academia | Indian Institute of Technology Bombay {vedang, adityajainjhs, abir}@cse.iitb.ac.in |
| Pseudocode | Yes | Algorithm 1: Training. Algorithm 2: Inference. |
| Open Source Code | Yes | Our code is in https://github.com/VedangAsgaonkar/genex |
| Open Datasets | Yes | We experiment with four datasets for the classification task; DP (disease prediction), MNIST, CIFAR100 and Tiny Imagenet (TI). Details are provided in the extended version (Asgaonkar, Jain, and De 2023). (Extended version of current paper). arXiv preprint arXiv:2312.12574. |
| Dataset Splits | Yes | We split the entire dataset in 70% training, 10% validation and 20% test set. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., CPU, GPU models, memory, or cloud instance types) used for running the experiments. |
| Software Dependencies | No | The paper mentions software components like "Python", "PyTorch", "β-VAE", "Wide Resnet", and "Efficient Net" but does not provide specific version numbers for any of them. |
| Experiment Setup | Yes | We set the number of buckets B = 8, 8, 4, 4 for DP, MNIST, CIFAR100 and Tiny Imagenet using cross validation. [...] given a budget qmax for maximum number of oracle queries for each instance. [...] where λ is a hyperparameter. |
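
The reported 70% training / 10% validation / 20% test split can be reproduced with a simple index shuffle. The sketch below is an illustration only, not the authors' code; the function name `split_dataset` and the fixed seed are assumptions, and the paper does not specify its shuffling or seeding procedure.

```python
import numpy as np

def split_dataset(n_samples, seed=0):
    """Shuffle indices and split 70% train / 10% validation / 20% test.

    Illustrative sketch; the paper does not state how the split was drawn.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    n_train = int(0.7 * n_samples)  # 70% training
    n_val = int(0.1 * n_samples)    # 10% validation
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:]    # remaining 20% test
    return train, val, test

train, val, test = split_dataset(1000)
print(len(train), len(val), len(test))  # 700 100 200
```

The three index arrays are disjoint by construction, so no instance leaks between splits.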