Thesis
Bayesian online learning for online portfolio selection
- Abstract:
We propose a novel family of Bayesian learning algorithms for online portfolio selection that overcome many of the shortcomings of traditional techniques, including selection bias (the failure to cover a broad universe of assets), data-snooping bias (the risk that a trading strategy's performance on past data is inflated by hyperparameter overfitting), and a lack of robustness to transaction costs.
As the basis for this novel family, we develop a Bayesian treatment of the online passive-aggressive and gradient descent algorithms, which are among the most popular algorithms in the literature. Our approach starts from a probabilistic interpretation of the underlying objective functions and enables uncertainty modelling, probabilistic predictions, and automatic, data-dependent hyperparameter tuning.
We conclude by testing our proposals on real-world financial data. We further benchmark our framework on a wide range of canonical test problems, on which it achieves a significant improvement over its competitors. Beyond online portfolio selection, our algorithms contribute to the theory of adaptive gradient methods by equipping them with uncertainty estimates and a self-tuning mechanism for the learning rate parameter, which constitutes a major milestone in the area of Bayesian inference for neural networks.
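For readers unfamiliar with the classical baseline named in the abstract, the sketch below illustrates a non-Bayesian online passive-aggressive update for portfolio selection, in the spirit of passive-aggressive mean reversion (PAMR). This is purely illustrative and is not the Bayesian algorithm developed in the thesis: the function names, the sensitivity threshold eps, and the toy data are assumptions made for the example.

```python
import numpy as np


def project_to_simplex(v):
    """Euclidean projection of a vector onto the probability simplex."""
    u = np.sort(v)[::-1]                      # sort in decreasing order
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)


def pamr_step(b, x, eps=0.5):
    """One passive-aggressive mean-reversion update (illustrative sketch).

    b   : current portfolio weights (non-negative, summing to one)
    x   : price relatives for the period (today's price / yesterday's price)
    eps : sensitivity threshold; an illustrative choice, not a value from the thesis
    """
    loss = max(0.0, float(b @ x) - eps)         # hinge loss on the period's return
    x_centred = x - x.mean()
    denom = float(x_centred @ x_centred)
    tau = loss / denom if denom > 0.0 else 0.0  # "aggressive" step size
    return project_to_simplex(b - tau * x_centred)


# Toy usage: three assets, uniform starting weights, one period of price relatives.
b = np.ones(3) / 3.0
x = np.array([1.02, 0.97, 1.01])
b = pamr_step(b, x)
print(b, b.sum())
```

Note the mean-reverting behaviour of the update: weight is shifted towards assets that underperformed in the current period. The thesis's contribution, per the abstract, is a probabilistic reinterpretation of this kind of objective, which such a point-estimate update does not provide.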
Authors
Contributors
- Role:
- Supervisor
- Institution:
- University of Oxford
- Division:
- MPLS
- Department:
- Engineering Science
- Role:
- Supervisor
- ORCID:
- 0000-0003-1959-012X
- Funder identifier:
- http://dx.doi.org/10.13039/501100001866
- Grant:
- 8837255
- Programme:
- AFR PhD
- Funder identifier:
- http://dx.doi.org/10.13039/501100000269
- Programme:
- Quantitative Finance stipend
- Type of award:
- DPhil
- Level of award:
- Doctoral
- Awarding institution:
- University of Oxford
- Language:
- English
- Deposit date:
- 2022-09-08
Terms of use
- Copyright holder:
- Salas, A
- Copyright date:
- 2020