Primal and Dual Analysis of Entropic Fictitious Play for Finite-sum Problems
Authors: Atsushi Nitanda, Kazusato Oko, Denny Wu, Nobuhito Takenouchi, Taiji Suzuki
ICML 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | We illustrate the efficiency of our novel implementation in experiments including neural network optimization and image synthesis. |
| Researcher Affiliation | Academia | ¹Kyushu Institute of Technology, ²Center for Advanced Intelligence Project, ³University of Tokyo, ⁴University of Toronto, ⁵Vector Institute for Artificial Intelligence. |
| Pseudocode | Yes | Algorithm 1 Discrete-time Entropic Fictitious Play; Algorithm 2 Efficient Implementation of EFP for Finite-sum Problem |
| Open Source Code | No | The paper does not provide concrete access to source code for the methodology described. |
| Open Datasets | No | The paper mentions generating data for a student-teacher setting and using the Mona Lisa image as a target, but does not provide specific access information (link, DOI, formal citation) for any publicly available datasets used for training. |
| Dataset Splits | No | The paper does not provide specific dataset split information for training, validation, or testing. |
| Hardware Specification | No | The paper does not provide specific hardware details (e.g., exact GPU/CPU models, memory amounts) used for running its experiments. |
| Software Dependencies | No | The paper does not provide specific ancillary software details with version numbers. |
| Experiment Setup | Yes | We optimize the neural network using EFP with an outer-loop step size ηγ = 0.01. At each iteration, we approximate the proximal Gibbs measure µ̂t via the Langevin Monte Carlo algorithm with step size η = 0.01. (Section 7.1) We run Algorithm 2 with λ′ = 10⁻⁵, λ = 10⁻⁴, T = 2000, S = 10, m = 1000, ηγ = 0.01 to fit the target image. As for the step size for Langevin Monte Carlo, we used cosine annealing from 0.1 to 0.01. (Figure 3 caption) |
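The setup quoted above alternates an outer EFP mixing step with an inner Langevin Monte Carlo (LMC) loop that approximates the proximal Gibbs measure. Below is a minimal, hypothetical Python sketch of that two-loop structure on a toy objective whose proximal Gibbs measure is a known Gaussian; it is not the paper's implementation, and all values except the inner LMC step size η = 0.01 are illustrative choices for the toy problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear functional: F(mu) = E_mu[f(w)] + lam * Ent(mu) with
# f(w) = 0.5 * (w - 1)^2, so the proximal Gibbs measure is N(1, lam).
def grad_f(w):
    return w - 1.0

lam = 0.1         # entropy regularization (toy value)
eta = 0.01        # inner LMC step size (value reported in the table above)
eta_gamma = 0.05  # outer-loop step size (toy value)
m = 2000          # number of particles representing the measure mu_t
S = 50            # LMC steps per outer iteration (toy value)
T = 200           # outer iterations (toy value)

particles = rng.normal(size=m)  # particle approximation of mu_0

for t in range(T):
    # Inner loop: LMC targeting the Gibbs measure proportional to exp(-f/lam),
    # warm-started from the current particles.
    w = particles.copy()
    for _ in range(S):
        w = w - eta * grad_f(w) / lam + np.sqrt(2 * eta) * rng.normal(size=m)
    # Outer EFP update: mu_{t+1} = (1 - eta_gamma) * mu_t + eta_gamma * mu_hat_t,
    # realized by resampling each particle from mu_hat_t with prob. eta_gamma.
    mask = rng.random(m) < eta_gamma
    particles[mask] = w[mask]

# Particles should concentrate around the Gibbs mean w = 1.
print(particles.mean())
```

Because the toy Gibbs measure is N(1, λ), the particle mean drifts toward 1 as the outer mixture forgets the initialization; the paper's experiments additionally anneal the LMC step size (cosine schedule from 0.1 to 0.01), which this sketch omits for brevity.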