About
Hello! My name is Ishaan Watts, and I am a first-year graduate student in the Machine Learning Department at Carnegie Mellon University. I am interested in the science of foundation models and understanding their failure modes. Currently, I am working with Prof. Aditi Raghunathan on studying how pretraining choices influence downstream performance [ICML'26][Spotlight Talk @ ICBINB Workshop, ICLR'26].

Previously, I was a Pre-Doctoral Researcher at Google DeepMind under the guidance of Dr. Partha Talukdar. I was part of the Modular Large-Scale Continual Learning group, where I worked on model composition. Before that, I was a Research Intern at Microsoft Research under Dr. Sunayana Sitaram, where we studied linguistic diversity in India and worked toward fair and inclusive evaluation of language models [NAACL'24, ACL'24, EMNLP'24, AAAI'25]. I completed my bachelor's degree at IIT Delhi.

Outside of research, I'm a fitness enthusiast and enjoy going to the gym and training for triathlons. I recently completed my first marathon!

Keywords: Large Language Models, Pretraining, Continual Learning, Evaluations

For more details about my background, refer to my Resume. If you'd like to chat about research or triathlon training, feel free to reach out via email!

Last Updated: 05/07/2026
Timeline
• May, 2026: Our work "Sharpness-Aware Pretraining Mitigates Catastrophic Forgetting" has been accepted at ICML 2026. See you in Seoul 🇰🇷!
• April, 2026: I have been invited to give a Spotlight Talk on our work, "Sharpness-Aware Pretraining Mitigates Catastrophic Forgetting", at the ICBINB Workshop @ ICLR 2026 in Rio de Janeiro, Brazil 🇧🇷.
• August, 2025: I have started my MSML program at Carnegie Mellon University and joined Prof. Aditi Raghunathan's lab to work on foundation models.
• July, 2025: I have been awarded the Narotam Sekhsaria ($40,000), J.N. Tata ($20,000) and K.C. Mahindra ($5,800) Scholarships to pursue higher studies at CMU!
• June, 2025: My work at DeepMind was acknowledged in the Gemini 2.5-Pro Technical Report.
• March, 2025: I will be joining the Fall 2025 Masters in Machine Learning (MSML) program at Carnegie Mellon University!
• January, 2025: We have released the cultural artefacts collected in our work "PARIKSHA". [Link]
• December, 2024: Our work "RTP-LX: Can LLMs Evaluate Toxicity in Multilingual Scenarios?" was accepted to the AI for Social Impact track at AAAI 2025 🇺🇸!
• September, 2024: Our work "PARIKSHA: A Large-Scale Investigation of Human-LLM Evaluator Agreement on Multilingual and Multi-Cultural Data" was accepted at EMNLP 2024 🇺🇸!
• September, 2024: Gave an invited talk, titled "Evaluations in Multilingual and Multicultural World", at the National Research Council Canada (NRCC).
Publications

C=Conference, P=Preprint, *=Equal Contribution

[C.5] Sharpness-Aware Pretraining Mitigates Catastrophic Forgetting
Ishaan Watts*, Catherine Li*, Sachin Goyal, Jacob Mitchell Springer, Aditi Raghunathan
Spotlight Talk @ ICBINB Workshop at ICLR 2026
43rd International Conference on Machine Learning
ICML 2026 arXiv ICML Proceedings Poster


[P.1] Gemini 2.5: Pushing the Frontier with Advanced Reasoning, Multimodality, Long Context, and Next Generation Agentic Capabilities
Gemini Team, Google DeepMind (including Ishaan Watts)
Technical Report arXiv Blog


[C.4] RTP-LX: Can LLMs Evaluate Toxicity in Multilingual Scenarios?
Adrian de Wynter, Ishaan Watts, et al.
The 39th Annual AAAI Conference on Artificial Intelligence
AAAI 2025 Artificial Intelligence for Social Impact track arXiv AAAI Proceedings Dataset Poster


[C.3] PARIKSHA: A Large-Scale Investigation of Human-LLM Evaluator Agreement on Multilingual and Multi-Cultural Data
Ishaan Watts, Varun Gumma, Aditya Yadavalli, Vivek Seshadri, Swami Manohar, Sunayana Sitaram
The 2024 Conference on Empirical Methods in Natural Language Processing
EMNLP 2024 arXiv ACL Anthology Dataset Poster Leaderboard


[C.2] MAPLE: Multilingual Evaluation of Parameter Efficient Finetuning of Large Language Models
Divyanshu Aggarwal*, Ashutosh Sathe*, Ishaan Watts, Sunayana Sitaram
The 62nd Annual Meeting of the Association for Computational Linguistics
ACL Findings 2024 arXiv ACL Anthology Poster


[C.1] MEGAVERSE: Benchmarking Large Language Models Across Languages, Modalities, Models and Tasks
Sanchit Ahuja, Divyanshu Aggarwal, Varun Gumma, Ishaan Watts, Ashutosh Sathe, Millicent Ochieng, Rishav Hada, Prachi Jain, Mohamed Ahmed, Kalika Bali, Sunayana Sitaram
The 2024 Annual Conference of the North American Chapter of the Association for Computational Linguistics
NAACL 2024 arXiv ACL Anthology Poster

Projects

Shiksha Copilot
An educational assistant that helps teachers create engaging content using Generative AI. Currently deployed in 30+ rural schools across Karnataka, India.

Shiksha Copilot preview

Team: Akshay Nambi, Tanuja Ganu, Kavyansh Chourasia, Srujana Oruganti, Krishna Prasad Srinivasan, Karan Kumar, Ishaan Watts, Yash Gadhia, Somnath Sendhil Kumar, Meena S, Sanchit Gupta
Collaborators: Sikshana Foundation
Media Coverage: Satya Nadella Times of India Indian Express Analytics India

Affiliations
IIT Delhi
2019 - 2023
Microsoft Research
2023 - 2024
Google DeepMind
2024 - 2025
Carnegie Mellon University
2025 - Present