
Ji Lin, Ph.D. (Graduated)

Ji Lin graduated from the MIT HAN Lab in Dec. 2023 and joined OpenAI as a research scientist. His research focuses on efficient deep learning computing, systems for ML, and, more recently, accelerating large language models (LLMs). Ji has done pioneering research in the field of TinyML. His research has received over 10,000 citations on Google Scholar and over 8,000 stars on GitHub. His work on LLM quantization (AWQ) received the Best Paper Award at MLSys'24. AWQ has been widely adopted by NVIDIA, Intel, Microsoft, AMD, HuggingFace, and Berkeley to accelerate LLM inference, and AWQ-quantized LLMs have been downloaded more than 6 million times on HuggingFace. Ji was an NVIDIA Graduate Fellowship finalist in 2020 and a Qualcomm Innovation Fellowship recipient in 2022. His work has been covered by MIT Tech Review, MIT News (twice on the MIT homepage and four times on MIT News), WIRED, Engadget, VentureBeat, and other outlets.

Honors and Fellowships

Ji Lin received the 2022 Qualcomm Innovation Fellowship.
Ji Lin was a finalist for the 2020 NVIDIA Graduate Fellowship.


Projects

Open source projects with over 1K GitHub stars

Blog Posts

Talks

Aug 2023: Ji Lin presented "SmoothQuant, AWQ, TinyChat" at UC Berkeley SkyLab.
Jun 2023: Ji Lin presented "SmoothQuant, AWQ" at NVIDIA.