Nahom Seyoum

Below are a couple of things I have worked on. I'm happy to chat about any of them.

Probability and ML

  • Grokking Interpretability: Understanding Grokking in Deep Neural Networks
  • Inequalities for Independent Vectors under Sublinear Expectation
  • Singular Learning Theory: Generalization in Deep Networks

Optimization

  • On the Convergence of Online Mirror Descent (OMD) and Stochastic Mirror Descent (SMD)
  • Non-convex Optimization via Sampling
  • Approximation Algorithms for Unconstrained Submodular Maximization

Other

  • Rational Binomial Coefficients and Denominators