Purdue University · Computer Science

Mohnish Harwani

Researching training algorithms & model architectures that improve large-scale deep neural networks, and ways to accelerate science with AI.

Resume available upon request

Actively seeking AI research positions (May 2026) or PhD positions (Fall 2026)

About

I'm a graduating senior at Purdue University majoring in Computer Science. My research focuses on developing better training algorithms for large neural networks, and leveraging AI to accelerate scientific research.


Over the past 2+ years, I've been working with nanoHUB to develop agents that perform autonomous research by leveraging software tools, public data repositories, and HPC clusters. We expect our Autonomous Universal Research Assistant (AURA) to be available to the public soon.


I've also been exploring meta-learned neural network optimizers that can extract more generalizable and broadly useful information during backpropagation.


Feel free to email me if you're interested in my work or in hiring, or if you'd just like to connect!


Latest News

Feb
2026
🎤 Talk
AURA: Agentic AI meets nanoHUB's FAIR Workflows and Data
Materials Research and Data Alliance (MaRDA)
Living in the Material World
Feb
2026
🎤 Talk
Leveraging Public Simulation Data to Accelerate Alloy Discovery and Optimization
Materials Research and Data Alliance (MaRDA)
Living in the Material World
Jan
2026
📄 Preprint
Autonomous Universal Research Assistant (AURA): Agentic AI meets nanoHUB's FAIR Workflows and Data
ChemRxiv
J.C. Verduzco, M. Harwani, D. Mejia, A. Strachan
View preprint
Nov
2025
🎤 Talk
Integrating AI into nanoHUB: Toward Intelligent and Connected Scientific Workflows
Google AI Summit
Indianapolis, IN
Poster PDF
Jun
2025
🎤 Talk
nanoRA: A Multi-Agent Research Assistant for Autonomous Scientific Workflows on nanoHUB
AI for Materials Science (AIMS) Workshop
Washington, DC
Poster PDF
May
2025
💼 Internship
Software Engineering Intern, iOS Development
Kaiser Permanente
On-device health calendar using Apple Foundation Models & HealthKit
Jan
2025
📄 Publication
Accelerating Active Learning Materials Discovery with FAIR Data and Workflows
Computational Materials Science — Elsevier
M. Harwani, J.C. Verduzco, B.H. Lee, A. Strachan
View paper
Dec
2024
🎤 Talk
The Impacts of Derivative-Free Optimization on Generalization of Neural Networks
RL and Out-of-Distribution ML Workshop
Purdue University

Publications & Preprints

Jan 2026
Autonomous Universal Research Assistant (AURA): Agentic AI meets nanoHUB's FAIR Workflows and Data
ChemRxiv Preprint
J.C. Verduzco, M. Harwani, D. Mejia, A. Strachan
View Preprint
Jan 2025
Accelerating Active Learning Materials Discovery with FAIR Data and Workflows: A Case Study for Alloy Melting Temperatures
Computational Materials Science (Elsevier)
M. Harwani, J.C. Verduzco, B.H. Lee, A. Strachan
View Paper

Projects

nanoHUB AURA
MULTI-AGENT SYSTEM · CHEMRXIV PREPRINT

Multi-agent research assistant that accesses 1.6M+ data points and 340+ published software tools on nanoHUB, plus HPC clusters, to perform autonomous multi-domain research across biology, physics, and nanotechnology. Produces validated, reproducible results and scales with growing scientific contributions.

Python LLMs Multi-Agent Scientific Computing LangChain
View on nanoHUB
Accelerating Active Learning Materials Discovery
ACTIVE LEARNING · MATERIALS SCIENCE · ELSEVIER PAPER

Active learning loop paired with public physics simulation data to accelerate materials discovery, achieving a 10x speedup in optimal alloy candidate identification. Published in Computational Materials Science, Elsevier.

Python Active Learning FAIR Data Materials Discovery
GitHub
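A minimal sketch of the loop's shape, to show how an active-learning cycle pairs a surrogate with an acquisition rule. The toy objective, the nearest-neighbor surrogate, and the distance-based uncertainty below are illustrative stand-ins, not the published pipeline:

```python
import numpy as np

def simulate(x):
    # Stand-in for a physics simulation (hypothetical toy objective,
    # maximized at x = 0.7).
    return -(x - 0.7) ** 2

def active_learning(n_rounds=10, pool_size=200, seed=0):
    rng = np.random.default_rng(seed)
    pool = rng.uniform(0, 1, pool_size)        # candidate "compositions"
    X = list(pool[:2])                         # seed labeled set
    y = [simulate(v) for v in X]
    for _ in range(n_rounds):
        # Surrogate: predict each candidate from its nearest labeled point;
        # uncertainty grows with distance to the nearest labeled point.
        d = np.abs(pool[:, None] - np.array(X)[None, :])
        pred = np.array(y)[d.argmin(axis=1)]
        unc = d.min(axis=1)
        score = pred + 1.0 * unc               # exploit + explore
        x_next = pool[int(score.argmax())]
        X.append(x_next)
        y.append(simulate(x_next))             # "run the simulation"
    return X[int(np.argmax(y))]                # best candidate found

best = active_learning()
```

The published work replaces the toy pieces with FAIR simulation workflows and a proper uncertainty-aware surrogate; the loop structure is the same.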
Learned Optimizers for LLM Pretraining
META-LEARNING · OPTIMIZATION

Cost-efficient LLM pretraining using learned optimizers. Achieved a 30% performance gain over Adam on several zero-shot tasks, improving transfer and robustness across downstream tasks.

Python Meta-Learning LLM Pretraining PyTorch
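To illustrate the setup (not the actual meta-trained optimizer): a learned optimizer is typically a small coordinate-wise network that maps per-parameter features, such as the gradient and momentum, to an update. In the sketch below the network's weights are hand-set to mimic scaled gradient descent, and the meta-training loop (unrolling the inner optimization and backpropagating through it) is omitted:

```python
import numpy as np

def learned_update(grad, mom, W1, W2):
    # Coordinate-wise learned optimizer: a tiny MLP maps per-parameter
    # features (gradient, momentum) to a parameter update.
    h = np.tanh(np.stack([grad, mom], axis=-1) @ W1)   # (n, 8) hidden
    return (h @ W2).squeeze(-1)                        # (n,) updates

# Hand-set weights that approximate update = -0.1 * tanh(grad); in practice
# W1 and W2 are meta-trained rather than fixed.
W1 = np.zeros((2, 8)); W1[0, 0] = 1.0
W2 = np.zeros((8, 1)); W2[0, 0] = -0.1

theta = np.ones(5) * 3.0          # inner problem: minimize ||theta||^2
mom = np.zeros_like(theta)
for _ in range(50):
    grad = 2 * theta
    mom = 0.9 * mom + grad
    theta = theta + learned_update(grad, mom, W1, W2)

final_loss = float(np.sum(theta ** 2))
```

Because the update rule is itself a network, meta-training can shape it to exploit statistics of the gradient history that hand-designed rules like Adam ignore.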
SAM-Stabilized Learned Optimizers
META-LEARNING · OPTIMIZATION

Meta-training pipeline integrating Sharpness-Aware Minimization into learned optimizer training, improving stability and zero-shot transfer across architectures, tasks, and initializations.

Python SAM Meta-Learning PyTorch
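The core Sharpness-Aware Minimization step can be sketched as follows; the quadratic stand-in for the meta-loss and all constants here are illustrative, not the project's actual pipeline:

```python
import numpy as np

def sam_step(theta, grad_fn, lr=0.1, rho=0.05):
    # SAM: first ascend to the (approximate) worst-case point within an
    # L2 ball of radius rho, then descend using the gradient taken there.
    g = grad_fn(theta)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)   # normalized ascent step
    g_sharp = grad_fn(theta + eps)                # gradient at perturbed point
    return theta - lr * g_sharp

# Toy objective standing in for the meta-loss (illustrative only).
grad_fn = lambda th: 2 * th
theta = np.array([2.0, -1.0])
for _ in range(100):
    theta = sam_step(theta, grad_fn)

final = float(np.linalg.norm(theta))
```

Seeking flat minima of the meta-loss is the motivation for folding this step into learned-optimizer training: flatter meta-optima tend to transfer better across architectures and initializations.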
KPEvents — Kaiser Permanente
ON-DEVICE AI · iOS

Intelligent health calendar prototype that generates personalized fitness and medication schedules using Kaiser clinical records and Apple HealthKit data. Implemented a privacy-first architecture using Apple Foundation Models with all health data processed and stored fully on-device.

Swift/SwiftUI Apple Foundation Models HealthKit On-Device AI
Zeroth-Order Optimization for Neural Network Training
OPTIMIZATION · CMA-ES

Zeroth-order gradient estimation to accelerate training and improve generalization. Includes a CMA-ES-based warmup optimizer that outperforms SGD in early training on noisy, nonconvex loss landscapes.

Python CMA-ES Zeroth-Order Generalization PyTorch
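A simplified antithetic zeroth-order gradient estimate conveys the idea: probe the loss along random directions instead of backpropagating. This is a bare evolution-strategies-style estimator, not full CMA-ES (no covariance adaptation), and the toy loss and hyperparameters are illustrative:

```python
import numpy as np

def zo_grad(f, theta, sigma=0.1, n_dirs=20, rng=None):
    # Antithetic zeroth-order gradient estimate: average finite differences
    # of f along random Gaussian directions.
    rng = rng or np.random.default_rng(0)
    g = np.zeros_like(theta)
    for _ in range(n_dirs):
        u = rng.normal(size=theta.shape)
        g += (f(theta + sigma * u) - f(theta - sigma * u)) / (2 * sigma) * u
    return g / n_dirs

rng = np.random.default_rng(0)
clean = lambda th: np.sum(th ** 2) + 0.1 * np.sum(np.sin(10 * th))  # nonconvex
noisy = lambda th: clean(th) + 0.01 * rng.normal()                  # noisy eval

theta = np.ones(5) * 2.0
for _ in range(200):
    theta = theta - 0.05 * zo_grad(noisy, theta, rng=rng)

final_loss = float(clean(theta))
```

The smoothing radius sigma is what helps on noisy, rugged landscapes: the estimator follows the gradient of a Gaussian-smoothed loss, which damps the high-frequency oscillations that trap plain SGD early in training.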
Baselining Generalization of Training Algorithms
BENCHMARKING · OPTIMIZATION

Robust benchmarking suite to compare generalization across optimizers for neural networks. Designed evaluation protocols to measure optimizer stability, consistency, and rollout quality across diverse tasks and architectures.

Python Benchmarking Optimizer Evaluation PyTorch
Adaptive Splay Tree
DATA STRUCTURES

Data structure combining splay trees with hash tables to accelerate access to nearby and recently queried information in O(log n). Exploits temporal and spatial locality to reduce amortized lookup cost.

C++ Splay Tree Hash Table Data Structures
GitHub
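The idea in miniature (a Python sketch, not the C++ implementation; class and method names are illustrative): the hash table locates any node in O(1), and splaying that node to the root then makes it, and keys near it, cheap to reach on subsequent queries:

```python
class Node:
    __slots__ = ("key", "val", "left", "right", "parent")
    def __init__(self, key, val):
        self.key, self.val = key, val
        self.left = self.right = self.parent = None

class AdaptiveSplayTree:
    def __init__(self):
        self.root = None
        self.index = {}                 # hash table: key -> Node

    def _rotate(self, x):
        # Rotate x above its parent p, preserving BST order.
        p, g = x.parent, x.parent.parent
        if x is p.left:
            p.left, x.right = x.right, p
            if p.left: p.left.parent = p
        else:
            p.right, x.left = x.left, p
            if p.right: p.right.parent = p
        p.parent, x.parent = x, g
        if g is None:
            self.root = x
        elif g.left is p:
            g.left = x
        else:
            g.right = x

    def _splay(self, x):
        while x.parent:
            p, g = x.parent, x.parent.parent
            if g:
                # zig-zig: rotate parent first; zig-zag: rotate x twice.
                self._rotate(p if (p.left is x) == (g.left is p) else x)
            self._rotate(x)

    def insert(self, key, val):
        node = Node(key, val)
        self.index[key] = node
        if self.root is None:
            self.root = node
            return
        cur = self.root                 # standard BST descent
        while True:
            nxt = cur.left if key < cur.key else cur.right
            if nxt is None:
                break
            cur = nxt
        if key < cur.key:
            cur.left = node
        else:
            cur.right = node
        node.parent = cur
        self._splay(node)

    def get(self, key):
        node = self.index.get(key)      # O(1) locate via the hash table
        if node is None:
            return None
        self._splay(node)               # move hot keys toward the root
        return node.val
```

Exact lookups skip the O(log n) descent entirely via the dict, while the splayed tree keeps recently touched regions shallow for range-style or nearby-key queries.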
Lightweight File System Retrieval
DATA STRUCTURES

In-memory hierarchical pseudo-filesystem supporting efficient retrieval using Heaps. Implements core filesystem operations (insert, search, delete) with optimized access patterns for hierarchical data.

C++ Heaps File System Data Structures
GitHub
Graph Analysis Engine
ALGORITHMS

Graph algorithms suite implementing SCC detection, maximum spanning trees, and constrained pathfinding for clustering and network optimization problems.

C++ SCC Spanning Trees Graph Algorithms
GitHub
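SCC detection can be sketched with Kosaraju's two-pass algorithm (a Python illustration of the technique, not the C++ engine itself):

```python
def sccs(graph):
    """Strongly connected components of a directed graph given as
    {node: [successors]}, via Kosaraju's two DFS passes."""
    # Pass 1: record DFS finish order (iterative to avoid recursion limits).
    visited, order = set(), []
    for s in graph:
        if s in visited:
            continue
        visited.add(s)
        stack = [(s, iter(graph.get(s, [])))]
        while stack:
            node, it = stack[-1]
            for nxt in it:
                if nxt not in visited:
                    visited.add(nxt)
                    stack.append((nxt, iter(graph.get(nxt, []))))
                    break
            else:                      # all successors done: node finishes
                order.append(node)
                stack.pop()
    # Pass 2: DFS on the reversed graph in reverse finish order.
    rev = {}
    for n, succs in graph.items():
        for m in succs:
            rev.setdefault(m, []).append(n)
    seen, comps = set(), []
    for s in reversed(order):
        if s in seen:
            continue
        seen.add(s)
        comp, stack = [], [s]
        while stack:
            n = stack.pop()
            comp.append(n)
            for m in rev.get(n, []):
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        comps.append(comp)             # each pass-2 tree is one SCC
    return comps
```

Each DFS tree found in the second pass is exactly one strongly connected component, which makes SCCs a natural preprocessing step for the clustering and network-optimization problems above.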
Stack-Based Maze Solver
ALGORITHMS

Stack-driven DFS and queue-based BFS traversal for generating and solving grid-based mazes. Implements both depth-first and breadth-first strategies for pathfinding.

C++ DFS BFS Pathfinding
GitHub
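Both strategies in miniature (a Python sketch, not the repository's code): the only difference between the two searches is the frontier's pop order, LIFO for DFS versus FIFO for BFS, with BFS returning a shortest path:

```python
from collections import deque

def solve(grid, start, goal):
    """Find a path through a grid maze ('#' = wall) with DFS and BFS."""
    def neighbors(cell):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                    and grid[nr][nc] != "#":
                yield nr, nc

    def search(pop):
        frontier = deque([start])
        came_from = {start: None}      # also serves as the visited set
        while frontier:
            cell = pop(frontier)
            if cell == goal:           # reconstruct path by backtracking
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came_from[cell]
                return path[::-1]
            for nxt in neighbors(cell):
                if nxt not in came_from:
                    came_from[nxt] = cell
                    frontier.append(nxt)
        return None

    dfs_path = search(lambda f: f.pop())       # stack: LIFO order
    bfs_path = search(lambda f: f.popleft())   # queue: FIFO order
    return dfs_path, bfs_path
```

DFS commits deep quickly (useful for maze generation), while BFS's level-by-level expansion guarantees the first goal visit is via a shortest path.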
Message Board
WEB APPLICATION

Message board web application built for CS 180 at Purdue University.

Java Web App
GitHub

Conference & Workshop Presentations

Feb 2026
AURA: Agentic AI meets nanoHUB's FAIR Workflows and Data
Materials Research and Data Alliance (MaRDA)
Living in the Material World
Feb 2026
Leveraging Public Simulation Data to Accelerate Alloy Discovery and Optimization
Materials Research and Data Alliance (MaRDA)
Living in the Material World
Nov 2025
Integrating AI into nanoHUB: Toward Intelligent and Connected Scientific Workflows
Google AI Summit
Indianapolis, IN
Poster PDF
Jun 2025
nanoRA: A Multi-Agent Research Assistant for Autonomous Scientific Workflows on nanoHUB
AI for Materials Science (AIMS) Workshop
Washington, DC
Poster PDF
Dec 2024
The Impacts of Derivative-Free Optimization on Generalization of Neural Networks
RL and Out-of-Distribution ML Workshop
Purdue University

Connect