I am a Research Fellow in the Machine Learning and Optimization group at Microsoft Research Lab - India, advised by Dr. Manik Varma and Dr. Prateek Jain. I work on the Intelligent Devices Expedition project, which aims to develop novel machine learning algorithms that make edge devices intelligent. My current focus is on fundamental machine learning and deep learning algorithms for both resource-efficient and general settings.
The latest output of this project, FastGRNN: A Fast, Accurate, Stable and Tiny Kilobyte Sized Gated Recurrent Neural Network, was accepted for publication at NeurIPS'18.
Before joining MSR, I was an undergraduate at the Indian Institute of Technology Bombay, majoring (with honors) in Computer Science and Engineering and minoring in Electrical Engineering. My undergraduate thesis, done jointly with Anand Dhoot under the guidance of Prof. Soumen Chakrabarti, focused on making entity typing efficient and geometrically sensible using supervised and self-supervised learning, in order to aid downstream applications such as Knowledge Base Completion and Fine Type Tagging.
I spent the summer after my second undergraduate year at Inria, in the Titane team under Dr. Pierre Alliez, working on stochastic mesh metric generation for 3D modeling. The following summer, I was at American Express Big Data Labs, working on making Gradient Boosted Machines more effective under the guidance of Dr. Vishwa Vinay.
I am one of the first and major contributors to EdgeML, an ML library for machine learning on the edge, which proclaims that 2 KB of RAM ought to be enough for everybody. Ping me if you think otherwise, *wink*
I am broadly interested in machine learning and systems, with a special interest in creating novel, generalizable machine/deep learning algorithms in both resource-constrained and large-scale settings — particularly for search, time series, and vision — along with building a fundamental understanding of the algorithms we propose.