Abstract
Using examples ranging from portfolio construction to algorithmic trading, this talk explains neural networks as a non-parametric econometrics technique. Matthew also provides various examples illustrating the tradeoffs between Deep Q-learning and supervised deep learning for predictive modeling with signals such as news sentiment.
Important Considerations
- Traditional Statistical Modeling
- Stats vs Machine Learning
- What Does a Network Classifier Output
- Taxonomy of Most Popular Neural Network Architectures
- Geometric Interpretation of Neural Networks
- Half-Moon Dataset
- Why Deep Learning
Summary
- Neural networks aren't themselves "black boxes", although they do treat the data-generation process as a black box
- The outputs of neural network classifiers are only probabilities if the features are conditionally independent (or there are enough layers)
- One layer is typically sufficient to capture the non-linearity in most financial applications (but multiple layers are needed for probabilistic output); the half-moon sketch after this list illustrates the single-layer case
- Recurrent neural networks are non-parametric, non-linear extensions of classical time series methods
- TensorFlow doesn't check that fitted recurrent neural networks are stationary (the RNN sketch below includes a manual check of the recurrent weights)
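The half-moon dataset listed above is a standard way to see the single-layer point: the two classes cannot be separated by a straight line, yet one small hidden layer separates them cleanly. Below is a minimal sketch; the choice of scikit-learn, the layer width, and the other hyperparameters are illustrative assumptions, not code from the talk.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Two interleaving half-moons: a linear classifier cannot separate them.
X, y = make_moons(n_samples=2000, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A single hidden layer with a handful of units captures the non-linearity.
clf = MLPClassifier(hidden_layer_sizes=(16,), activation="relu",
                    max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"Test accuracy with one hidden layer: {clf.score(X_test, y_test):.3f}")
```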
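For the last two points, the RNN sketch below fits a small recurrent network to a simulated AR(1) series, treating the RNN as a non-linear generalization of the autoregression, and then inspects the fitted recurrent weight matrix by hand, since TensorFlow imposes no stationarity constraint during training. The simulated data, the Keras SimpleRNN layer, and the spectral-radius check are assumptions made for illustration, not material from the talk.

```python
import numpy as np
import tensorflow as tf

# Simulate an AR(1) process: x_t = 0.7 * x_{t-1} + noise, a toy stand-in for a
# classical time series model that the RNN extends non-linearly.
rng = np.random.default_rng(0)
x = np.zeros(5000)
for t in range(1, len(x)):
    x[t] = 0.7 * x[t - 1] + rng.normal(scale=0.1)

# Frame as one-step-ahead prediction from windows of the last 20 observations.
window = 20
X = np.stack([x[t - window:t] for t in range(window, len(x))])[:, :, None]
y = x[window:]

# A SimpleRNN's hidden state is a non-linear function of the previous state and
# the current input, which generalizes the AR recursion.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.SimpleRNN(8),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64, verbose=0)

# TensorFlow does not enforce or report stationarity of the fitted recurrence,
# so inspect the recurrent weights yourself. SimpleRNN weights are ordered
# [input kernel, recurrent kernel, bias]; a spectral radius well below 1 is a
# rough indication that the fitted dynamics are not explosive.
recurrent_kernel = model.layers[0].get_weights()[1]
spectral_radius = np.max(np.abs(np.linalg.eigvals(recurrent_kernel)))
print(f"Spectral radius of the recurrent weight matrix: {spectral_radius:.3f}")
```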
Access the full video and slides
Request access to the full video and slides of the session "The Neural Networks Survival Kit for Quants" held at the RavenPack Research Symposium in New York City in September 2018.