Index Tracking with Cardinality Constraints: A Stochastic Neural Networks Approach
Authors: Yu Zheng, Bowei Chen, Timothy M. Hospedales, Yongxin Yang (pp. 1242–1249)
AAAI 2020
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | The proposed approach is examined with S&P 500 index data for more than 10 years and compared with widely used index tracking approaches such as forward and backward selection and the largest market capitalisation methods. The empirical results show our model achieves excellent performance. |
| Researcher Affiliation | Collaboration | Yu Zheng (1,2), Bowei Chen (3), Timothy M. Hospedales (4), Yongxin Yang (2,4) — 1: Southwestern University of Finance and Economics; 2: Array Stream Technologies Limited; 3: University of Glasgow; 4: University of Edinburgh |
| Pseudocode | No | The paper does not contain any explicitly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper does not provide any statement about releasing source code or include a link to a code repository. |
| Open Datasets | No | The pricing data is obtained from the Center for Research in Security Prices (CRSP), which is known to be the most accurate data for study. While CRSP is a recognized data source, the paper does not provide a direct link, DOI, or a formal citation with author and year that would allow public access to the specific dataset used. |
| Dataset Splits | No | The paper describes a sliding window backtesting approach for training and evaluating the model over time, but it does not specify a separate validation set used for hyperparameter tuning or model selection in the traditional sense of a train/validation/test split. |
| Hardware Specification | No | The paper does not provide specific details about the hardware (e.g., CPU, GPU models, memory) used for running the experiments. It only generally mentions neural networks and related concepts which can imply GPU usage, but no specific hardware is listed. |
| Software Dependencies | No | The paper does not specify any software dependencies with version numbers (e.g., Python, TensorFlow, PyTorch versions) used in the experiments. |
| Experiment Setup | Yes | To guarantee convergence, we introduce an annealing process that progressively reduces the randomness of sampling, such that, for each of the K selections, only the one with the highest probability will be chosen in the end. This can be realised by adding a temperature term into the reparametrisation of π_{i,j} in Eq. 6: π_{i,j}(τ) = exp(S_{i,j}/τ) / ∑_{j} exp(S_{i,j}/τ), where τ = 0.1/log(e + t) and t is the iteration index. |
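The temperature-annealed softmax quoted above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the score array `scores` (standing in for one row S_{i,·} of the selection scores) and the iteration values are hypothetical; only the formulas π_{i,j}(τ) = exp(S_{i,j}/τ) / ∑_j exp(S_{i,j}/τ) and τ = 0.1/log(e + t) come from the paper.

```python
import numpy as np

def annealed_softmax(scores, t):
    """Temperature-annealed softmax over one row of selection scores.

    tau = 0.1 / log(e + t) shrinks as the iteration index t grows,
    so the distribution concentrates on the highest-scoring asset
    (approaching a deterministic argmax choice in the limit).
    """
    tau = 0.1 / np.log(np.e + t)
    z = scores / tau
    z -= z.max()              # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Hypothetical scores for one of the K selections:
scores = np.array([0.2, 0.5, 0.1, 0.4])
early = annealed_softmax(scores, t=1)        # still somewhat spread out
late = annealed_softmax(scores, t=100_000)   # nearly one-hot on the argmax
```

As `t` grows, the probability mass collapses onto the single highest-scoring asset, matching the paper's statement that "only the one with the highest probability will be chosen in the end."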