CosNet: A Generalized Spectral Kernel Network

Authors: Yanfang Xue, Pengfei Fang, Jinyue Tian, Shipeng Zhu, Hui Xue

NeurIPS 2023

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | In this section, we first introduce the implementation details, including the comparison methods and evaluation datasets. We then conduct systematic experiments to demonstrate the superiority of the proposed CosNet, especially on the time-series classification task.
Researcher Affiliation | Academia | 1 School of Computer Science and Engineering, Southeast University, Nanjing, 210096, China; 2 Key Laboratory of New Generation Artificial Intelligence Technology and Its Interdisciplinary Applications (Southeast University), Ministry of Education, China. {230218795, fangpengfei, 220222083, shipengzhu, hxue}@seu.edu.cn
Pseudocode | No | The paper does not contain structured pseudocode or algorithm blocks.
Open Source Code | No | The paper does not provide concrete access to source code (a specific repository link, an explicit code-release statement, or code in the supplementary materials) for the methodology it describes.
Open Datasets | Yes | To systematically evaluate the performance of our CosNet, we conduct comparison experiments on several typical time-series datasets, including 12 sub-datasets with the default training/testing split from the UCR Archive [Dau et al., 2019] for the classification task and 3 UCI [Blake, 1998] localization datasets for the regression task.
Dataset Splits | No | The paper mentions the "default training and testing data splitting" for the datasets, but does not explicitly describe a separate validation split (percentages, counts, or methodology) in the main text.
Hardware Specification | Yes | All experiments are implemented with PyTorch [Paszke et al., 2019] and conducted on a workstation with an NVIDIA RTX 3090 GPU, an AMD R7-5700X 3.40 GHz 8-core CPU, and 32 GB of memory.
Software Dependencies | Yes | All experiments are implemented with PyTorch [Paszke et al., 2019].
Experiment Setup | Yes | Each method is trained with Adam [Kingma and Ba, 2014] using cross-entropy loss for the classification task and L2 loss for the regression task. The learning rate is 0.01, and the weight matrices are initialized from a normal distribution N(0, 0.01). Each model contains five layers: the input layer, the output layer, and three hidden layers.
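
The setup quoted above can be sketched in PyTorch. This is a minimal illustration, not the paper's actual CosNet architecture: the layer widths, activation function, and batch size are assumptions (the paper does not specify them here), and "N(0, 0.01)" is read as a standard deviation of 0.01, though it could also denote the variance.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions -- the paper does not state layer widths.
IN_DIM, HIDDEN_DIM, NUM_CLASSES = 128, 64, 10

def init_weights(m):
    # Weights drawn from a normal distribution N(0, 0.01), per the paper.
    # std=0.01 is an assumption; N(0, 0.01) may instead mean variance 0.01.
    if isinstance(m, nn.Linear):
        nn.init.normal_(m.weight, mean=0.0, std=0.01)
        nn.init.zeros_(m.bias)

# Five layers in total: input layer, three hidden layers, output layer.
model = nn.Sequential(
    nn.Linear(IN_DIM, HIDDEN_DIM), nn.ReLU(),
    nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ReLU(),
    nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ReLU(),
    nn.Linear(HIDDEN_DIM, HIDDEN_DIM), nn.ReLU(),
    nn.Linear(HIDDEN_DIM, NUM_CLASSES),
)
model.apply(init_weights)

# Adam with learning rate 0.01; cross-entropy for classification
# (the regression task would use an L2/MSE loss instead).
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on random data.
x = torch.randn(32, IN_DIM)
y = torch.randint(0, NUM_CLASSES, (32,))
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The same skeleton covers the regression experiments by swapping `nn.CrossEntropyLoss()` for `nn.MSELoss()` and sizing the output layer to the regression targets.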