Permutation Search of Tensor Network Structures via Local Sampling

Authors: Chao Li, Junhua Zeng, Zerui Tao, Qibin Zhao

ICML 2022

Reproducibility assessment. Each entry lists the variable, the result, and the supporting LLM response (quoted from the paper where applicable).
Research Type: Experimental. "We conduct a theoretical investigation of TN-PS and propose a practically-efficient algorithm to resolve the problem. Theoretically, we prove the counting and metric properties of search spaces of TN-PS... Numerically, we propose a novel meta-heuristic algorithm... Numerical results demonstrate that the new algorithm can reduce the required model size of TNs in extensive benchmarks..."
Researcher Affiliation: Academia. "(1) RIKEN Center for Advanced Intelligence Project (RIKEN-AIP), Tokyo, Japan; (2) School of Automation, Guangdong University of Technology, Guangzhou, China; (3) Tokyo University of Agriculture and Technology, Tokyo, Japan."
Pseudocode: Yes. The paper provides "Algorithm 1: Random sampling over Id(G)" and "Algorithm 2: TN-structure Local Sampling (TNLS)".
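As a rough, self-contained illustration of what a local-sampling search of this kind can look like, here is a minimal Python sketch. It is not the authors' Algorithm 2: the transposition-based neighborhood, the geometric shrinkage of the sampling radius, and the toy objective are all our assumptions; only #Iter = 30, #Sample = 60, and c1 = 0.9 are borrowed from the experiment setup quoted below.

```python
# Minimal sketch of a local-sampling search over permutations, in the
# spirit of Algorithm 2 (TNLS). NOT the authors' implementation: the
# neighborhood, shrinkage schedule, and objective are assumptions.
import random

def random_neighbor(perm, radius):
    """Return a permutation near `perm`, obtained by `radius` random swaps."""
    p = list(perm)
    for _ in range(max(1, radius)):
        i, j = random.sample(range(len(p)), 2)
        p[i], p[j] = p[j], p[i]
    return tuple(p)

def local_sampling_search(evaluate, n_modes, n_iter=30, n_samples=60, c1=0.9):
    """Draw candidates around the incumbent, keep the best one found,
    and shrink the sampling radius by the factor c1 each iteration."""
    best = tuple(range(n_modes))            # start from the identity permutation
    best_loss = evaluate(best)
    radius = float(n_modes)                 # initial neighborhood radius
    for _ in range(n_iter):
        for _ in range(n_samples):
            cand = random_neighbor(best, int(round(radius)))
            loss = evaluate(cand)
            if loss < best_loss:
                best, best_loss = cand, loss
        radius = max(1.0, radius * c1)      # anneal the neighborhood size
    return best, best_loss

# Toy usage: an objective that prefers permutations close to the identity.
best, loss = local_sampling_search(
    lambda p: sum(abs(i - v) for i, v in enumerate(p)), n_modes=8)
```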
Open Source Code: Yes. "Our code is available at https://github.com/ChaoLiAtRIKEN/TNLS."
Open Datasets: Yes. "The Combined Cycle Power Plant (CCPP) (Tufekci, 2014) dataset... The MG (Flake & Lawrence, 2002) data... The Protein (Dua & Graff, 2017) data... we randomly select 10 natural images from the BSD500 (Arbelaez et al., 2010)."
Dataset Splits: No. "For all the datasets, we randomly choose 80% of the data for training and the rest for testing, then standardize the training and testing sets respectively by removing the mean and scaling to unit variance..." There is no explicit mention of a separate validation split or its size.
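A minimal sketch of this split-and-standardize protocol, assuming scikit-learn (the paper does not name its tooling for this step; the placeholder data and variable names are ours):

```python
# Sketch of the quoted protocol: random 80/20 train/test split, then each
# set is standardized with its own statistics ("respectively" in the quote).
# scikit-learn and the placeholder data are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X, y = np.random.randn(200, 4), np.random.randn(200)    # placeholder data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.8, shuffle=True)

X_train = StandardScaler().fit_transform(X_train)        # train statistics
X_test = StandardScaler().fit_transform(X_test)          # test statistics
```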
Hardware Specification: No. "Part of the computation was carried out at the Riken AIp Deep learning ENvironment (RAIDEN)." This names a computing environment but provides no specific hardware details such as GPU/CPU models or memory.
Software Dependencies: No. The paper mentions using the 'Adam optimizer (Kingma & Ba, 2014)' and the 'Matlab commands resize and rgb2gray', but specifies no version numbers for these or any other software dependencies.
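Absent pinned versions, a reproduction might substitute the Matlab image preprocessing with, for example, Pillow; a hypothetical equivalent follows (the target size is our assumption, not stated in this section):

```python
# Hypothetical Pillow stand-in for the Matlab rgb2gray + resize steps.
# Pillow's "L" mode uses the same ITU-R 601 luma weights as rgb2gray;
# the 256x256 target size is an assumption.
from PIL import Image

def preprocess_image(path, size=(256, 256)):
    """Load an image, convert to 8-bit grayscale, and resize."""
    return Image.open(path).convert("L").resize(size)
```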
Experiment Setup Yes In our method, we set the template G0 as a cycle graph, the searching range for TN-ranks R = 7, the maximum iteration #Iter = 30, the number of samples #Sample = 60, and the tuning parameters c1 = 0.9 and c2 = 0.9, 0.94, 0.98... We set the learning rate of the Adam optimizer (Kingma & Ba, 2014) to 0.001. The maximum number of generations is set to be 30. The population in each generation is set to be 150... elimination rate is 36%... There is a chance of 24% for each gene to mutate...
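For readability, the quoted settings can be gathered in one place. The key names below are our assumptions, while the values are taken verbatim from the quote; the generation/population/mutation numbers appear to describe a genetic-algorithm baseline rather than the proposed method:

```python
# Search/training settings for the proposed method, as quoted above.
TNLS_CONFIG = {
    "template_graph": "cycle",        # template G0
    "rank_search_range": 7,           # R
    "max_iterations": 30,             # #Iter
    "num_samples": 60,                # #Sample
    "c1": 0.9,
    "c2": [0.9, 0.94, 0.98],
    "adam_learning_rate": 1e-3,
}

# Genetic-algorithm baseline settings, as quoted above.
GA_BASELINE_CONFIG = {
    "max_generations": 30,
    "population_size": 150,
    "elimination_rate": 0.36,         # 36%
    "per_gene_mutation_rate": 0.24,   # 24%
}
```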