Gated Neural Networks for Option Pricing: Rationality by Design

Authors: Yongxin Yang, Yu Zheng, Timothy Hospedales

AAAI 2017

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | "Experiments on S&P 500 index options show that our approach is significantly better than others."
Researcher Affiliation | Academia | Yongxin Yang, Yu Zheng, Timothy M. Hospedales; EECS, Queen Mary, University of London; Imperial Business School, Imperial College London. yongxin.yang@qmul.ac.uk, t.hospedales@qmul.ac.uk, y.zheng12@imperial.ac.uk
Pseudocode | No | The paper describes the neural network architecture and mathematical formulas, but does not include any clearly labeled pseudocode or algorithm blocks.
Open Source Code | No | A footnote states: "We release the code of these methods in Github: github.com/arraystream/fft-option-pricing". This refers to the code for the baseline econometric methods, not the authors' proposed neural network method.
Open Datasets | No | The S&P 500 index option data come from OptionMetrics and Bloomberg, which provide historical end-of-day bid and ask quotes. These are commercial data providers, and no public link or citation for public access to the dataset is given.
Dataset Splits | No | The paper states: "we train a model with five continuous trading days data, and use the following one day for testing." This describes the train/test split, but no separate validation set for hyperparameter tuning is mentioned.
Hardware Specification | No | The paper provides no details of the hardware (e.g., CPU or GPU model, memory) used to run the experiments.
Software Dependencies | No | The paper names the Adam optimiser (Kingma and Ba 2015) but does not specify its version or any other software dependencies with version numbers.
Experiment Setup | Yes | For Single and PSSF, the hidden layer has J = 5 neurons. Multi uses I = 9 pricing models (matching the MNN baseline's setting) and K = 5 neurons in the hidden layer of its right-branch weighting network.
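The Dataset Splits row quotes a rolling scheme: train on five consecutive trading days, test on the following day. A minimal sketch of such a rolling split generator is below; the window stride and variable names are assumptions for illustration, since the paper does not state how the window advances.

```python
# Sketch of a rolling train/test scheme over trading days:
# train on `train_len` consecutive days, test on the next `test_len` day(s),
# then slide the window forward. Stride of one day is an assumption.

def rolling_splits(num_days, train_len=5, test_len=1):
    """Yield (train_day_indices, test_day_indices) pairs."""
    splits = []
    start = 0
    while start + train_len + test_len <= num_days:
        train = list(range(start, start + train_len))
        test = list(range(start + train_len, start + train_len + test_len))
        splits.append((train, test))
        start += 1  # slide the window by one trading day
    return splits

# With 8 trading days this yields three windows:
# days 0-4 train / day 5 test, days 1-5 / day 6, days 2-6 / day 7.
```

Note that, as the table observes, no days are held out for validation; any hyperparameter tuning would have to reuse the training window.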
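To make the Experiment Setup sizes concrete, here is a hedged sketch of networks with those dimensions: a "Single" network with J = 5 hidden neurons, and a "Multi" network that combines I = 9 pricing models via a weighting branch with K = 5 hidden neurons. The activations, input features, and the paper's rationality (gating) constraints are not reproduced; this only illustrates how the reported sizes fit together, with the weighting branch producing a convex combination of expert prices.

```python
import numpy as np

rng = np.random.default_rng(0)
J, I, K = 5, 9, 5       # sizes reported in the paper
n_features = 2          # e.g. moneyness and time to maturity (assumption)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def single_price(x, W1, b1, w2, b2):
    """'Single': one hidden layer of J neurons -> scalar price."""
    h = np.tanh(W1 @ x + b1)        # hidden activations, shape (J,)
    return float(w2 @ h + b2)

def multi_price(x, experts, Wg1, bg1, Wg2, bg2):
    """'Multi': I expert prices mixed by a K-hidden-unit weighting branch."""
    prices = np.array([f(x) for f in experts])   # (I,) expert outputs
    g = np.tanh(Wg1 @ x + bg1)                   # (K,) gating hidden layer
    weights = softmax(Wg2 @ g + bg2)             # (I,), non-negative, sums to 1
    return float(weights @ prices)               # convex combination

# Toy usage with random parameters and stand-in constant "pricing models"
x = np.array([1.02, 0.25])
W1 = rng.normal(size=(J, n_features)); b1 = rng.normal(size=J)
w2 = rng.normal(size=J)
p_single = single_price(x, W1, b1, w2, 0.0)

expert_vals = rng.uniform(1.0, 2.0, size=I)
experts = [lambda x, s=s: float(s) for s in expert_vals]
Wg1 = rng.normal(size=(K, n_features)); bg1 = rng.normal(size=K)
Wg2 = rng.normal(size=(I, K)); bg2 = rng.normal(size=I)
p_multi = multi_price(x, experts, Wg1, bg1, Wg2, bg2)
```

Because the weighting branch ends in a softmax, the Multi output always lies within the range of the individual experts' prices, which is one simple way a mixture of valid pricing models stays well behaved.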