Synthesizing Action Sequences for Modifying Model Decisions

Authors: Goutham Ramakrishnan, Yun Chan Lee, Aws Albarghouthi (pp. 5462-5469)

AAAI 2020

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "We implement our approach and apply it to a number of neural networks learned from popular datasets. Our results demonstrate the effectiveness of our approach, the benefits of our algorithmic decisions, and the robustness of the synthesized action sequences to noise. For exploring questions Q1-3, we consider three popular datasets: the German Credit Data (Dua and Graff 2017) and the Fannie Mae Single Family Loan Performance (Mae 2014) datasets concern classifying loan applications as high or low risk. The Adult Dataset (Dua and Graff 2017) predicts income as high or low; the envisioned use case is that it can be used to set salaries."
Researcher Affiliation | Academia | Goutham Ramakrishnan, Yun Chan Lee, Aws Albarghouthi, University of Wisconsin-Madison, {gouthamr, aws}@cs.wisc.edu, yunchan.c.lee@gmail.com
Pseudocode | Yes | Algorithm 1 (Full synthesis algorithm): "1: function SYNTHESIZE(model f, instance x, actions A)"
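The paper's Algorithm 1 enumerates action sequences and optimizes each sequence's parameters until the model's decision flips. The sketch below illustrates that loop under simplifying assumptions: the paper uses TensorFlow gradients and an adaptive penalty, while here `score_fn` is a plain real-valued function (decision is 1 iff the score is positive), gradients are approximated by finite differences, and each action takes a single scalar parameter. All names (`synthesize`, `apply_seq`, `score_fn`) are illustrative, not the authors' code.

```python
import itertools

def apply_seq(x, seq, params):
    """Apply each action in `seq` with its parameter to instance x."""
    y = list(x)
    for act, p in zip(seq, params):
        y = act(y, p)
    return y

def synthesize(score_fn, x, actions, max_len=4, steps=100, lr=0.5):
    """Hedged sketch of the SYNTHESIZE loop: for each action sequence up
    to max_len, ascend the model score via finite-difference gradients
    until the decision flips (score > 0), then return the sequence."""
    for length in range(1, max_len + 1):
        for seq in itertools.product(actions, repeat=length):
            params = [0.0] * length  # one scalar parameter per action
            for _ in range(steps):
                if score_fn(apply_seq(x, seq, params)) > 0:
                    return list(seq), params  # decision flipped
                eps, grads = 1e-4, []
                for i in range(length):
                    hi, lo = params[:], params[:]
                    hi[i] += eps
                    lo[i] -= eps
                    grads.append((score_fn(apply_seq(x, seq, hi))
                                  - score_fn(apply_seq(x, seq, lo))) / (2 * eps))
                params = [p + lr * g for p, g in zip(params, grads)]
    return None  # no sequence of length <= max_len flips the decision
```

For example, with a toy score `sum(x) - 1` and a single "increase all features by p" action, the loop finds a parameter pushing the score past the decision boundary within a few steps.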
Open Source Code | Yes | "Our algorithm is implemented in Python 3, using TensorFlow (Abadi et al. 2016)." Code: https://github.com/goutham7r/synth-action-seq
Open Datasets | Yes | "For exploring questions Q1-3, we consider three popular datasets: the German Credit Data (Dua and Graff 2017) and the Fannie Mae Single Family Loan Performance (Mae 2014) datasets concern classifying loan applications as high or low risk. The Adult Dataset (Dua and Graff 2017) predicts income as high or low; the envisioned use case is that it can be used to set salaries."
Dataset Splits | No | No explicit training/validation/test split details are provided for the authors' own model training. The paper mentions using 'test sets' to select instances for their algorithm, but does not specify the splits used to train the deep neural networks.
Hardware Specification | No | "For fast experimentation, we implemented a brute-force version of Algorithm 1 where all sequences up to some length n are optimized in parallel using AWS Lambda, i.e., each sequence is optimized as a separate Lambda." No specific hardware models (GPU/CPU) are mentioned.
Software Dependencies | No | No specific version numbers for software dependencies are provided. The paper states: "Our algorithm is implemented in Python 3, using TensorFlow (Abadi et al. 2016). The Adam optimizer (Kingma and Ba 2014) is used to solve optimization Problem 3."
Experiment Setup | Yes | "Our algorithm assumes a differentiable model, e.g., a deep neural network, of the form f : R^m → {0, 1}, as well as differentiable actions, cost functions and preconditions. In practice, we perform an adaptive search for the best value of the hyperparameter c as we solve the optimization problem: at a variable-length interval t of minimization steps, we determine how close the search is to the decision boundary and adjust c and t accordingly. For our primary models, we make our algorithm consider all sequences of length 4. Specifically, for each synthesized solution sequence and each parameter r in the sequence, we uniformly sample values from the interval [(1 − θ)r, (1 + θ)r], where θ denotes the maximum percentage change."
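The robustness check quoted above perturbs each synthesized parameter r uniformly within [(1 − θ)r, (1 + θ)r] and asks how often the perturbed sequence still flips the decision. A minimal sketch of that sampling, with illustrative names (`perturb_params`, `robustness`, and the caller-supplied `still_flips` predicate are assumptions, not the paper's API):

```python
import random

def perturb_params(params, theta, n_samples=100, seed=0):
    """For each parameter r, draw uniformly from [(1-theta)*r, (1+theta)*r],
    where theta is the maximum percentage change."""
    rng = random.Random(seed)
    return [[rng.uniform((1 - theta) * r, (1 + theta) * r) for r in params]
            for _ in range(n_samples)]

def robustness(still_flips, params, theta, n_samples=100):
    """Fraction of perturbed parameter vectors for which the caller-supplied
    predicate `still_flips` reports that the decision still changes."""
    hits = sum(still_flips(s) for s in perturb_params(params, theta, n_samples))
    return hits / n_samples
```

For instance, with θ = 0.1 a parameter r = 10.0 is perturbed within [9.0, 11.0], so a sequence that tolerates that whole range scores a robustness of 1.0.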