I'm a third-year undergraduate at IIT Kharagpur ('27), specializing in the foundational mathematics and from-scratch implementation of deep learning architectures, with a particular focus on Transformers and NLP.
I believe in understanding systems from the ground up, whether it's building a language model from scratch or solving complex algorithmic problems. I'm passionate about dissecting the latest research papers to stay at the forefront of AI.
- I'm currently building "S1", a ~125-150M parameter Transformer-based language model, written from scratch in PyTorch to solidify my understanding of modern LLM and deep learning architectures.
- I’m an incoming SDE Intern @ CorridorPlatforms.
- I previously gained experience in quantitative research at WorldQuant.
- I'm a Specialist on Codeforces and love competitive programming.
- I'm actively studying NLP (including "Speech and Language Processing" by Jurafsky and Martin) and the foundations of deep learning from texts like "Understanding Deep Learning" by Simon J.D. Prince.
- I’m always open to collaborating on challenging open-source AI or systems-level projects.
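The heart of a from-scratch Transformer LM like S1 is causal self-attention. The sketch below is not S1's code; it is a minimal single-head illustration in NumPy (the projection shapes and function name are my own for this example) of the scaled dot-product attention with a causal mask that any GPT-style decoder builds on:

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head scaled dot-product attention with a causal mask.

    x:          (T, d) sequence of token embeddings
    Wq, Wk, Wv: (d, d_head) learned projection matrices (random here)
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv          # project tokens to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (T, T) scaled similarity scores
    T = scores.shape[0]
    mask = np.tril(np.ones((T, T), dtype=bool))
    scores = np.where(mask, scores, -np.inf)  # each token sees only itself and the past
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v                        # weighted sum of value vectors
```

Because of the mask, the first token's output is just its own value projection; later tokens mix in earlier positions. A full decoder block wraps this in multi-head projections, residual connections, layer norm, and an MLP.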
This stack reflects my focus on both deep learning implementation and full-stack development.
Deep Learning & Model Engineering: