Efficient AI Computing,
Transforming the Future.

Projects


QuantumNAS: Noise-Adaptive Search for Robust Quantum Circuits

HPCA 2022 (oral)

Design of Variational Quantum Algorithm Programs

Network Augmentation for Tiny Deep Learning

ICLR 2022

NetAug is a training technique for tiny neural networks. It embeds a tiny neural network into larger networks as a shared sub-network, so the tiny network receives extra guidance during training. NetAug consistently improves the performance of tiny models, achieving up to a 2.2% accuracy improvement on ImageNet.
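As a toy illustration of the weight-sharing idea (a minimal numpy sketch under assumed layer sizes, not the NetAug implementation), the tiny model's layer can be a sub-block of a widened layer, so any training step on the augmented model also updates the weights the tiny model deploys with:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical widths: tiny layer has 4 units, augmented layer has 8.
W_aug = rng.normal(size=(8, 8)) * 0.1   # widened (augmented) weights
TINY = (slice(0, 4), slice(0, 4))       # tiny model = shared sub-block

def tiny_forward(x):
    # The tiny network only ever uses its own sub-block of the weights.
    return W_aug[TINY] @ x

def aug_forward(x):
    # The augmented network runs the full widened layer; its loss adds
    # extra gradient signal to the shared sub-block during training.
    return W_aug @ x

x = rng.normal(size=8)
y_tiny_before = tiny_forward(x[:4])

# A pretend gradient step on the *augmented* model: because the weights
# are shared, it also moves the tiny model's parameters.
W_aug -= 0.1 * np.outer(aug_forward(x), x)
```

At deployment only the sub-block is kept, so the tiny model pays no extra inference cost for the training-time augmentation.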

NAAS: Neural Accelerator Architecture Search

DAC 2021

NAAS is a data-driven approach that jointly searches for a well-matched pair of accelerator architecture and neural architecture, together with an efficient compiler mapping.
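To make the joint-search idea concrete, here is a deliberately tiny sketch (not the NAAS algorithm; the search axes and cost model are invented for illustration) that samples accelerator and network choices together and keeps the best pair under a single scalarized objective:

```python
import random

# Assumed search axes: processing-element count for the accelerator,
# width multiplier for the network. Both are toy values.
ACCEL_PE = [64, 128, 256]
NET_WIDTH = [0.5, 0.75, 1.0]

def score(pe, width):
    # Made-up cost model: wider nets are more accurate, more PEs cut latency.
    accuracy = 70 + 10 * width
    latency = (width * 100) / pe
    return accuracy - 50 * latency

def joint_search(trials=50, seed=0):
    # Sample accelerator and network choices *together*, so the search can
    # trade one off against the other instead of fixing either side first.
    rng = random.Random(seed)
    return max(((rng.choice(ACCEL_PE), rng.choice(NET_WIDTH))
                for _ in range(trials)), key=lambda c: score(*c))
```

Searching the two spaces jointly is the point: the best network for one accelerator is generally not the best for another, so fixing either side first leaves performance on the table.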

Delayed Gradient Averaging: Tolerate the Communication Latency in Federated Learning

NeurIPS 2021

We propose Delayed Gradient Averaging (DGA), which delays the averaging step so that local computation proceeds in parallel with communication, improving training efficiency under high network latency.
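A toy simulation of the delayed-averaging schedule (a sketch under assumed toy gradients, not the paper's implementation): each worker applies its local gradient immediately, launches a non-blocking average, and when that average arrives `delay` rounds later, replaces the stale local step with the global one:

```python
import numpy as np

def dga_simulate(num_workers=4, rounds=8, delay=2, lr=0.1, seed=0):
    """Simulate Delayed Gradient Averaging on random toy gradients."""
    rng = np.random.default_rng(seed)
    params = [np.zeros(3) for _ in range(num_workers)]
    in_flight = []  # (arrival_round, per-worker local grads, global average)
    for t in range(rounds):
        # Each worker takes a local step right away (no blocking all-reduce).
        grads = [rng.normal(size=3) for _ in range(num_workers)]
        for w in range(num_workers):
            params[w] -= lr * grads[w]
        # Launch the average; it "arrives" `delay` rounds later, so
        # communication overlaps with the next rounds of computation.
        in_flight.append((t + delay, grads, sum(grads) / num_workers))
        # Apply averages whose communication has completed: swap the
        # stale local step for the global averaged step.
        arrived = [m for m in in_flight if m[0] <= t]
        in_flight = [m for m in in_flight if m[0] > t]
        for _, old_grads, g_avg in arrived:
            for w in range(num_workers):
                params[w] += lr * old_grads[w]  # undo stale local step
                params[w] -= lr * g_avg         # apply global average
    return params
```

With `delay=0` this reduces to ordinary synchronous averaging (all workers end with identical parameters); with `delay>0` the workers diverge only by their last `delay` uncorrected local steps.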