Deep Archimedean Copulas

Authors: Chun Kai Ling, Fei Fang, J. Zico Kolter

NeurIPS 2020

Reproducibility variables, with the result and the supporting LLM response for each:
Research Type: Experimental
"Our experiments show that ACNet is able to both approximate common Archimedean copulas and generate new copulas which may provide better fits to data. Empirical results show that ACNet is able to learn standard copulas with little to no hyperparameter tuning. When tested on real-world data, we observed that ACNet was able to learn new generators which are a better qualitative description of observed data compared to commonly used Archimedean copulas. Lastly, we demonstrate the effectiveness of ACNet in situations where measurements are uncertain within known boundaries. This task is challenging for methods which learn densities, as evaluating probabilities would then involve the costly numerical integration of densities."
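The last point follows from ACNet modelling the copula CDF directly: the probability of a rectangle of uncertain measurements is an inclusion-exclusion over four CDF evaluations, with no density integration. Below is a minimal sketch of that rectangle rule, using the closed-form Clayton CDF as a stand-in for a learned copula (clayton_cdf and interval_prob are illustrative names, not taken from the paper's code):

```python
def clayton_cdf(u1, u2, theta=2.0):
    # Clayton copula CDF: C(u1, u2) = (u1^-theta + u2^-theta - 1)^(-1/theta)
    return (u1 ** -theta + u2 ** -theta - 1.0) ** (-1.0 / theta)

def interval_prob(cdf, lo1, hi1, lo2, hi2):
    # P(lo1 <= U1 <= hi1, lo2 <= U2 <= hi2) by inclusion-exclusion on the CDF.
    return cdf(hi1, hi2) - cdf(lo1, hi2) - cdf(hi1, lo2) + cdf(lo1, lo2)

print(interval_prob(clayton_cdf, 0.2, 0.4, 0.3, 0.6))
```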
Researcher Affiliation: Academia
Chun Kai Ling, Computer Science Dept., Carnegie Mellon University (chunkail@cs.cmu.edu); Fei Fang, Institute for Software Research, Carnegie Mellon University (feif@cmu.edu); J. Zico Kolter, Computer Science Dept., Carnegie Mellon University and Bosch Center for AI (zkolter@cs.cmu.edu)
Pseudocode: No
The paper does not include a pseudocode block or a clearly labeled algorithm.
Open Source Code: Yes
"The source code for this paper may be found at https://github.com/lingchunkai/ACNet."
Open Datasets: Yes
"To verify that ACNet is able to learn commonly used Archimedean copulas, we generate synthetic datasets from the Clayton, Frank and Joe copulas. These copulas exhibit different tail dependencies (see Figure 2a). ... For each copula, we generate 2000 train and 1000 test points and train ACNet for 40k epochs. ... To demonstrate the efficacy of ACNet, we applied ACNet to 3 real-world datasets. ... Boston Housing. ... (INTC-MSFT) ... (GOOG-FB). We collected daily closing prices of Google (GOOG) and Facebook (FB) from May 2015 to May 2020. The data was collected using Yahoo Finance and comprises 1259 samples. ... We use the GAS dataset [34], which comprises readings from chemical sensors used in simulations for drift compensation."
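For context, samples from a Clayton copula (one of the three synthetic generators above) can be drawn with the standard Marshall-Olkin mixture construction. The sketch below is an assumed reconstruction of such a data generator, not code from the authors' repository:

```python
import numpy as np

def sample_clayton(n, theta=2.0, seed=0):
    # Marshall-Olkin sampling for the Clayton copula (theta > 0):
    # V ~ Gamma(1/theta, 1), E_i ~ Exp(1), U_i = (1 + E_i / V)^(-1/theta).
    rng = np.random.default_rng(seed)
    v = rng.gamma(shape=1.0 / theta, scale=1.0, size=n)  # mixing variable
    e = rng.exponential(size=(n, 2))                     # i.i.d. Exp(1) pairs
    return (1.0 + e / v[:, None]) ** (-1.0 / theta)

train = sample_clayton(2000, theta=2.0)         # 2000 train points
test = sample_clayton(1000, theta=2.0, seed=1)  # 1000 test points
```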
Dataset Splits: Yes
"For each copula, we generate 2000 train and 1000 test points and train ACNet for 40k epochs. ... Train and test sets are split based on a 3:1 ratio. We normalize both train and test sets independently. ... We use the same synthetic dataset of Section 4.1. For each datapoint, instead of observing the tuple $(u_1, u_2)$, we observe $(\underline{u}_1, \overline{u}_1, \underline{u}_2, \overline{u}_2)$, where $\underline{u}_i \leq U_i \leq \overline{u}_i$. The upper and lower bounds of $U_i$ are chosen randomly such that $\overline{u}_i - U_i$ and $U_i - \underline{u}_i$ are uniformly chosen from $[0, \lambda]$, where $\lambda$ is a noise parameter associated with the experiment."
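A sketch of that censoring scheme follows (the function name and the clipping of bounds to [0, 1] are assumptions; the paper does not state how interval endpoints are kept inside the unit square):

```python
import numpy as np

def censor(u, lam, seed=0):
    # Wrap each true value u_i in a random interval [lo_i, hi_i] whose
    # one-sided widths are drawn uniformly from [0, lam].
    rng = np.random.default_rng(seed)
    lo = np.clip(u - rng.uniform(0.0, lam, size=u.shape), 0.0, 1.0)
    hi = np.clip(u + rng.uniform(0.0, lam, size=u.shape), 0.0, 1.0)
    return lo, hi

u = np.random.default_rng(2).uniform(size=(2000, 2))  # stand-in for copula samples
lo, hi = censor(u, lam=0.1)
```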
Hardware Specification: Yes
"Experiments are conducted on a 3.1 GHz Intel Core i5 with 16 GB of RAM."
Software Dependencies: Yes
"We utilize the PyTorch [28] framework for automatic differentiation. We use double precision arithmetic as the inversion of ϕ requires numerical precision."
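In PyTorch, the double-precision default can be set globally with one call; a minimal sketch of that setup:

```python
import torch

# Run in float64: numerically inverting the generator phi is sensitive
# to round-off, so single precision may not suffice.
torch.set_default_dtype(torch.float64)
```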
Experiment Setup: Yes
"For all our experiments we use ACNet with L = 2 and H1 = H2 = 10, i.e., 2 hidden layers each of width 10. The network is small but sufficient for our purpose since ϕnn is only 1-dimensional. ΦA and ΦB were initialized in the range [0, 1] and (0, 2) uniformly at random. We use stochastic gradient descent with a learning rate of 1e-5, momentum of 0.9, and a batch size of 200. No hyperparameter tuning was performed."
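As a rough illustration of the reported scale only (a plain MLP, ignoring ACNet's nonnegative-weight constraints that keep the generator completely monotone), the stated sizes and optimizer settings might look like this in PyTorch:

```python
import torch
import torch.nn as nn

torch.set_default_dtype(torch.float64)

# Hypothetical stand-in for the 1-D generator network phi_nn:
# two hidden layers of width 10 (L = 2, H1 = H2 = 10).
net = nn.Sequential(
    nn.Linear(1, 10), nn.Sigmoid(),
    nn.Linear(10, 10), nn.Sigmoid(),
    nn.Linear(10, 1),
)

# SGD with the reported learning rate and momentum; training would
# iterate over minibatches of 200 samples.
opt = torch.optim.SGD(net.parameters(), lr=1e-5, momentum=0.9)
```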