Domain Adaptation as a Problem of Inference on Graphical Models

Authors: Kun Zhang, Mingming Gong, Petar Stojanov, Biwei Huang, Qingsong Liu, Clark Glymour

NeurIPS 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experimental results on both synthetic and real data demonstrate the efficacy of the proposed framework for domain adaptation. The code is available at https://github.com/mgong2/DA_Infer." and Section 5, "Experiments".
Researcher Affiliation | Collaboration | Kun Zhang (1), Mingming Gong (2), Petar Stojanov (3), Biwei Huang (1), Qingsong Liu (4), Clark Glymour (1); (1) Department of Philosophy, Carnegie Mellon University; (2) School of Mathematics and Statistics, University of Melbourne; (3) Computer Science Department, Carnegie Mellon University; (4) Unisound AI Lab
Pseudocode | No | No explicit pseudocode or algorithm blocks were found in the paper.
Open Source Code | Yes | The code is available at https://github.com/mgong2/DA_Infer.
Open Datasets | Yes | "We then perform evaluations on the cross-domain indoor WiFi localization dataset [59].", "We also evaluate our method on the Graft vs. Host Disease Flow Cytometry dataset (GvHD) [61].", and "Following the experimental setting in [60], we build a multi-source domain dataset by combining four digits datasets, including MNIST, MNIST-M, SVHN, and SynthDigits."
Dataset Splits | No | The paper gives sample sizes for the source and target domains (e.g., "sampled 500 points in each source domain and the target domain"; "sampled 700 points in 10 replicates") and specifies training and test counts for the digits datasets ("randomly sample 20,000 labeled images for training... and test on 9,000 examples"), but it does not explicitly mention validation splits.
Hardware Specification | No | No specific hardware details (such as GPU/CPU models, memory, or cloud instance types) used for running the experiments were provided in the paper.
Software Dependencies | No | The paper mentions using Multi-Layer Perceptrons (MLPs) and Generative Adversarial Networks (GANs), but does not specify any software names with version numbers for the libraries or frameworks used.
Experiment Setup | Yes | "We model each module in the graph with 1-hidden-layer MLPs with 32 nodes. In each replication, we randomly sample the MLP parameters and domain-specific θ values from N(0, I)." and "We implement our LV-CGAN by using Multi-Layer Perceptrons (MLPs) with one hidden layer (32 nodes) to model the function of each module and set the dimension of input noise E and θ involved in each module to 1."
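To make the quoted setup concrete, the following is a minimal sketch of one such causal-module MLP: one hidden layer with 32 nodes, taking parent values plus a 1-dimensional noise input E and a 1-dimensional domain parameter θ, with weights drawn from N(0, 1) as in the paper's synthetic experiments. This is an illustrative reconstruction, not the authors' released code (which lives at https://github.com/mgong2/DA_Infer); the function names and the tanh activation are assumptions.

```python
import math
import random

random.seed(0)

HIDDEN = 32  # hidden-layer width reported in the paper's experiments


def make_mlp_module(n_parents):
    """Randomly initialize one causal-module MLP.

    Input dimension is n_parents + 2: the parent values, a 1-d noise
    input E, and a 1-d domain parameter theta, matching the LV-CGAN
    description. Sampling weights from N(0, 1) mirrors the paper's
    synthetic setup, where MLP parameters are drawn from N(0, I).
    """
    n_in = n_parents + 2
    w1 = [[random.gauss(0, 1) for _ in range(n_in)] for _ in range(HIDDEN)]
    b1 = [random.gauss(0, 1) for _ in range(HIDDEN)]
    w2 = [random.gauss(0, 1) for _ in range(HIDDEN)]
    b2 = random.gauss(0, 1)
    return w1, b1, w2, b2


def forward(module, parents, e, theta):
    """Scalar output of MLP(parents, E, theta); tanh is an assumed activation."""
    w1, b1, w2, b2 = module
    x = list(parents) + [e, theta]
    h = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    return sum(wi * hi for wi, hi in zip(w2, h)) + b2


# Example: generate one value for a variable with a single parent.
mod = make_mlp_module(n_parents=1)
y = forward(mod, parents=[0.5], e=random.gauss(0, 1), theta=0.3)
```

In the paper's framework, one such module is instantiated per node of the augmented graphical model, with θ acting as the domain-specific latent variable shared across domains.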