I am a PhD candidate in the Statistics department at UCLA and a
member of the Statistical Machine Learning Lab led by Quanquan Gu.
I also develop the deep learning algorithms that power the ChatterBaby app.
* Theory of deep learning: optimization, generalization, etc.
* Statistical learning theory
* Non-convex optimization
* Applications of deep learning: natural language understanding, audio processing
* New paper on agnostic learning of halfspaces with gradient descent is now on arXiv.
* My single neuron paper
was accepted at NeurIPS 2020.
* I received a Best Reviewer Award for ICML 2020.
* I will be attending the IDEAL Special Quarter on the Theory of Deep Learning at TTIC/Northwestern for the fall quarter.
* I'm reviewing for AISTATS.
* Updated the arXiv version of my recent paper
on agnostic learning of a single neuron with improved bounds.
* I've been awarded a Dissertation Year Fellowship by UCLA's Graduate Division.
* New paper on PAC learning of a single neuron using gradient descent is now on arXiv.
* New paper accepted at Brain Structure and Function, from joint work with researchers at the UCLA School of Medicine.
* I'll be (remotely) working at Amazon Alexa for the summer as a research intern, focusing on natural language understanding.
* I'm reviewing for NeurIPS 2020.
* I'm reviewing for ICML 2020.
* My paper with Yuan Cao and Quanquan Gu, "Algorithm-dependent Generalization Bounds for Overparameterized Deep Residual Networks", was accepted at NeurIPS 2019.
I am currently a PhD candidate in the Statistics department at UCLA
and a member of the Statistical
Machine Learning Lab. I am supervised by Ying
Nian Wu from the Department of Statistics and Quanquan
Gu from the Department of Computer Science. I completed my
master's in mathematics at the University of British Columbia,
Vancouver, in May 2015. I was a member of the Probability
Group, and Ed
Perkins was my supervisor. Before that, I completed my
undergraduate degree in mathematics at McGill University in 2013.
You may find more information about me on my CV
(last updated August 2020).
For 2020-2021, I have a UCLA Dissertation Year Fellowship and will
not be teaching.
Past teaching positions:
Spring 2020: Stats 100C, Linear Models with Arash Amini.
Fall 2019: Stats 102C, Monte Carlo Methods with Qing Zhou.
Summer 2016, Session C: Stats 10, Intro Statistics with Juana Sanchez.
Summer 2016, Session A: Stats 10, Intro Statistics with Miles Chen.
Fall 2016: Stats 100A, Introduction to Probability Theory with Ying Nian Wu.
Winter 2016: Stats 100B, Introduction to Mathematical Statistics.
Since Fall 2017, I have been working as a graduate student researcher
(GSR) for Ariana Anderson at the Semel Institute for Neuroscience and
Human Behavior. For the 2016-2017 school year, I worked as a GSR for
Ariana Anderson and Monika Mellem.
Preprints
1. S. Frei, Y. Cao, and Q. Gu. Agnostic learning of halfspaces with gradient descent via soft margins. arXiv preprint.
Refereed Conference Publications
2. S. Frei, Y. Cao, and Q. Gu. Agnostic learning of a single neuron with gradient descent. In Neural Information Processing Systems (NeurIPS), 2020. [arxiv]
3. S. Frei, Y. Cao, and Q. Gu. Algorithm-dependent generalization bounds for overparameterized deep residual networks. In Neural Information Processing Systems (NeurIPS), 2019. [camera ready]
Journal Publications
4. A.E. Anderson, M. Diaz-Santos, S. Frei, et al. Hemodynamic latency is associated with reduced intelligence across the lifespan: an fMRI DCM study of aging, cerebrovascular integrity, and cognitive ability. Brain Structure and Function, 2020. [link]
5. S. Frei and E. Perkins. A lower bound for $p_c$ in range-$R$ bond percolation in two and three dimensions. Electronic Journal of Probability, 21(56), 2016. [link]
6. S. Frei, K. Lockwood, G. Stewart, J. Boyer, and B.S. Tilley. On thermal resistance in concentric residential geothermal heat exchangers. Journal of Engineering Mathematics.