Arjun Karuvally

theory. intelligence. computation. math.


Postdoctoral Researcher

Salk Institute for Biological Studies

I’m Arjun Karuvally, a researcher passionate about the mathematical and theoretical foundations of artificial intelligence. My work centers on understanding how neural networks store and process memory, with a particular focus on recurrent architectures and energy-based models.

I enjoy exploring questions about how intelligence emerges from computation and aim to develop new frameworks that both advance our understanding and improve practical AI design. Much of my research involves bridging gaps between complex mathematical ideas and real-world AI systems.

I’m always eager to learn more, dive into challenging problems, and push the boundaries of what AI can achieve. Feel free to explore the site to learn more about my projects, publications, and ongoing research. If you have any comments, thoughts, or queries, you can reach me at arjun.k018@gmail.com.

news

Jun 8, 2025 New preprint released! Transient Dynamics in Lattices of Differentiating Ring Oscillators. Check it out at https://arxiv.org/pdf/2506.07253
May 20, 2025 New preprint released! Bridging Expressivity and Scalability with Adaptive Unitary SSMs. Check it out at https://arxiv.org/pdf/2507.05238
Mar 10, 2025 Defended my PhD thesis titled Beyond the Hopfield Memory Theory: Dynamic Energy Landscapes and Traveling Waves in RNNs. Thank you to my advisors Hava T. Siegelmann and Terry Sejnowski, and to my committee members Cameron Musco and Ina Fiterau.
Jul 21, 2024 Presenting our paper titled Hidden Traveling Waves bind Working Memory Variables in Recurrent Neural Networks at ICML 2024: https://proceedings.mlr.press/v235/karuvally24a.html
Oct 4, 2023 New preprint released! We applied the Episodic Memory Theory to mechanistically interpret RNNs. We fully describe the behavior of RNNs trained on simple tasks and provide a method to interpret the learned parameters and hidden states. Check it out at https://arxiv.org/abs/2310.02430

latest posts

Oct 18, 2023 Fractals
Sep 14, 2023 Memory and the Energy Paradigm

selected publications

2024

  1. Hidden Traveling Waves bind Working Memory Variables in Recurrent Neural Networks
    Arjun Karuvally, Terrence Sejnowski, and Hava T Siegelmann
    In Proceedings of the 41st International Conference on Machine Learning, 21–27 Jul 2024

2023

  1. General Sequential Episodic Memory Model
    Arjun Karuvally, Terrence J. Sejnowski, and Hava T. Siegelmann
    In International Conference on Machine Learning, ICML 2023, 23–29 July 2023, Honolulu, Hawaii, USA
  2. Episodic Memory Theory of Recurrent Neural Networks: Insights into Long-Term Information Storage and Manipulation
    Arjun Karuvally, Peter DelMastro, and Hava T. Siegelmann
    In Annual Workshop on Topology, Algebra, and Geometry in Machine Learning (TAG-ML) at the 40th International Conference on Machine Learning (ICML 2023), 23–29 Jul 2023