Below are a few things I have worked on. Happy to chat about any of them.
Probability and ML
- The Parisi Solution of the Sherrington–Kirkpatrick Model
- Grokking Interpretability: Understanding Grokking in Deep Neural Networks
- Exponential Concentration Inequalities for Independent Random Vectors under Sublinear Expectations
- Inequalities for Independent Vectors under Sublinear Expectation
- Singular Learning Theory: Generalization in Deep Networks