Financial Engineering Seminar
Speaker: Harvey Stein (Two Sigma)
Title: Model Invariants and Functional Regularization in Machine Learning
Abstract
When modeling data, we would like to know that our models are extracting facts about the data itself, and not about something arbitrary, like the order of the factors used in the modeling. Formally speaking, this means that we want the model to be invariant with respect to certain transformations. Here we look at different models and the nature of their invariants. As is commonly known, regression, MLE and Bayesian estimation are all invariant with respect to linear transformations, whereas regularized regressions have a far more limited set of invariants. As a result, regularized regressions produce results that are less about the data itself and more about how it is parameterized. To regularize machine learning models without losing invariants, we propose an alternative expression of regularization which we call functional regularization. Ridge regression and lasso are special cases of functional regularization, as is Bayesian estimation. But functional regularization preserves model invariance, whereas ridge and lasso do not. It is also more flexible, easier to understand, and can even be applied to non-parametric models.
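The invariance claim in the abstract can be checked numerically. The sketch below (not from the talk; the data, the penalty value, and the function names are illustrative assumptions) fits ordinary least squares and ridge regression, then reparameterizes the factors by an invertible linear map T. OLS fitted values are unchanged, while the ridge fit, whose penalty depends on the coordinates used, is not.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=n)

def ols_predict(X, y, Xq):
    # Ordinary least squares via the normal equations.
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    return Xq @ beta

def ridge_predict(X, y, Xq, lam=1.0):
    # Ridge regression: the penalty lam * ||beta||^2 is coordinate-dependent.
    k = X.shape[1]
    beta = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)
    return Xq @ beta

# Reparameterize the factors: rescale the first factor by 100
# (an invertible linear transformation T).
T = np.diag([100.0, 1.0, 1.0])
Xq = rng.normal(size=(5, p))

# OLS is invariant: fitting on X @ T and querying at Xq @ T gives
# the same predictions as fitting on X and querying at Xq.
print(np.allclose(ols_predict(X, y, Xq), ols_predict(X @ T, y, Xq @ T)))

# Ridge is not: the same reparameterization changes its predictions.
print(np.allclose(ridge_predict(X, y, Xq), ridge_predict(X @ T, y, Xq @ T)))
```

The reason is visible in the closed forms: the OLS coefficients transform as T⁻¹β, so fitted values X T T⁻¹β are unchanged, while the ridge penalty λ‖β‖² is not preserved by T, so the regularized solution depends on the parameterization.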
Biography
Dr. Harvey Stein is a senior VP in the Labs group at Two Sigma. From 1993 to 2022, he was at Bloomberg, where he served as the head of several departments. He also teaches risk management at Columbia University and serves on the board of directors of the IAQF. Dr. Stein has published and lectured on credit risk modeling, financial regulation, interest rate and FX modeling, CVA calculations, mortgage-backed security valuation, COVID-19 data analysis and other subjects. Dr. Stein holds a Ph.D. in Mathematics from the University of California, Berkeley (1991) and a B.S. in Mathematics from Worcester Polytechnic Institute (1982).