Department of Applied Physics, Stanford University

ORCID: 0000-0003-2601-7487

My Google Scholar

My Github

I am a recent graduate of the Applied Physics department at Stanford University, where I was a theorist in the Ganguli Lab.
My current research interests focus on using methods from nonequilibrium statistical physics and large deviation theory
to study biological computation, from the microscopic scale of single-receptor computations to macroscopic reinforcement learning. More broadly, I am interested in statistical physics,
computational neuroscience, and machine learning.

My latest paper (with Subhaneil Lahiri) studies thermodynamic limits on sensors modeled as continuous-time Markov chains, using stochastic thermodynamics and large deviation theory. We place bounds on sensors of this type by first deriving a thermodynamic uncertainty relation for the empirical densities in subsets of Markov-chain states.
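To give a flavor of the objects involved, here is a minimal sketch (not code from the paper, and with hypothetical rate parameters) that simulates a two-state continuous-time Markov chain with the Gillespie algorithm and measures the empirical density, i.e. the fraction of time spent in one state; it is exactly this kind of time-averaged observable whose fluctuations a thermodynamic uncertainty relation constrains.

```python
# Illustrative sketch only: a two-state continuous-time Markov chain,
# simulated with the Gillespie algorithm. The rates below are arbitrary
# choices for demonstration, not values from the paper.
import random

def empirical_density(k01, k10, t_max, seed=0):
    """Fraction of time spent in state 0, for a 2-state CTMC with
    transition rates k01 (0 -> 1) and k10 (1 -> 0), up to time t_max."""
    rng = random.Random(seed)
    state, t, time_in_0 = 0, 0.0, 0.0
    while t < t_max:
        rate = k01 if state == 0 else k10
        dwell = rng.expovariate(rate)      # exponential holding time
        dwell = min(dwell, t_max - t)      # clip at the time horizon
        if state == 0:
            time_in_0 += dwell
        t += dwell
        state = 1 - state                  # jump to the other state
    return time_in_0 / t_max

# The stationary density of state 0 is k10 / (k01 + k10) = 2/3 here;
# the empirical density fluctuates around it, and the precision of such
# time-averaged densities is what a TUR bounds in terms of dissipation.
rho = empirical_density(k01=1.0, k10=2.0, t_max=10000.0)
print(rho)
```

In the paper the observable of interest is the density in a *subset* of states of a larger chain, but the two-state case already shows the basic quantity being bounded.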

With Chris Stock and Sam Ocko, I have also recently been working on a project deriving biologically plausible synaptic update rules that provably preserve the task an RNN is trained to perform, while also increasing the network's robustness. We can think of this synaptic update rule as traversing a manifold in synaptic weight space of networks that perform exactly the same task, in search of the network with the best noise robustness. Both of these projects have associated preprints, listed below.

I recently defended my thesis, and will be starting as a research fellow at the Flatiron Institute in Fall 2022!
Christopher H. Stock, Sarah E. Harvey, Samuel A. Ocko, Surya Ganguli. (preprint 2021). *Synaptic balancing: a biologically plausible local learning rule that provably increases neural network noise robustness without sacrificing task performance.* https://arxiv.org/abs/2107.08530.

Todd Karin, Xiayu Linpeng, M. M. Glazov, M. V. Durnev, E. L. Ivchenko, Sarah Harvey, Ashish K. Rai, Arne Ludwig, Andreas D. Wieck, and Kai-Mei C. Fu. July 2016. *Giant permanent dipole moment of two-dimensional excitons bound to a single stacking fault.* Physical Review B, 94(4). doi:10.1103/physrevb.94.041201. (link)

University of Washington B.S., Physics and Astronomy, *summa cum laude*, 2015

Stanford University Ph.D., Applied Physics, 2022

- Invited talk at BIRS conference Mathematical Models in Biology: from Information Theory to Thermodynamics (Online, July 2020) video
- APS 2020 abstract and presentation slides here
- Bernstein 2019: A local synaptic balancing rule for homeostatic plasticity (poster)

*Winter 2019:* NBIO 228 (Mathematical Tools for Neuroscience). Developed curriculum, lectures, and homework assignments for the core mathematical methods class required for first-year neuroscience PhD students at Stanford. Notes

*Summer 2021:* Methods in Computational Neuroscience Research Facilitator. Summer school website: https://www.mbl.edu/mcn/

- 2021 Beg Rohu Summer School Participant (poster)
- 2018 Center for Mind, Brain, Computation and Technology Graduate Trainee
- 2017 Methods in Computational Neuroscience Summer School student, Woods Hole, MA.
- 2016 National Defense Science and Engineering Graduate Fellowship (NDSEG) recipient
- 2015 Stanford Graduate Fellowship, William R. Hewlett Fellow
- 2014 Mary L. Boas Endowed Scholarship in Physics
- 2010 - 2014 Washington NASA Space Grant Scholar
- 2010 - 2014 Mary Gates Endowment for Students
- Phi Beta Kappa honors society, Sigma Pi Sigma physics honors society

Ham radio: KI7UXI

Drawing

US mail: Sarah Harvey, ChEM-H / Neuro research complex, 290 Jane Stanford Way, Stanford, CA 94305

Email: harveys@stanford.edu

Twitter: @SarahLizHarvey

Sarah Harvey <harveys@stanford.edu> Last modified: 10/21