Efficient AI Computing,
Transforming the Future.

Projects


FlatFormer: Flattened Window Attention for Efficient Point Cloud Transformer

CVPR 2023

We present FlatFormer, an efficient transformer architecture for large-scale point cloud analysis, built on flattened window attention.
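The sketch below is a rough illustration of the flattened-window idea, not the official FlatFormer implementation: points are sorted into a 1D order, split into equal-size groups, and standard multi-head attention runs within each group. The group size, feature dimension, and the simple x-axis ordering are all illustrative assumptions.

```python
# Minimal sketch of flattened window attention (illustrative only).
import torch
import torch.nn as nn


def flattened_window_attention(points, feats, group_size=64, dim=64, heads=4):
    """points: (N, 3) coordinates, feats: (N, dim) features."""
    # Sort points along a 1D ordering (x-axis as a stand-in for a
    # window-major ordering) so nearby points land in the same group.
    order = torch.argsort(points[:, 0])
    feats = feats[order]

    # Pad so the point count divides evenly into equal-size groups.
    n = feats.shape[0]
    pad = (-n) % group_size
    feats = torch.cat([feats, feats.new_zeros(pad, dim)], dim=0)

    # Reshape into (num_groups, group_size, dim) and attend within each group.
    # The attention layer is created inline only to keep the sketch short.
    groups = feats.view(-1, group_size, dim)
    attn = nn.MultiheadAttention(dim, heads, batch_first=True)
    out, _ = attn(groups, groups, groups)
    return out.reshape(-1, dim)[:n]


# Usage: 10k random points with 64-dim features.
pts = torch.rand(10_000, 3)
x = torch.rand(10_000, 64)
print(flattened_window_attention(pts, x).shape)  # torch.Size([10000, 64])
```

Because every group has the same number of points, the attention workload is perfectly balanced, which is the key to the method's efficiency on large point clouds.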

Retrospective: EIE: Efficient Inference Engine on Sparse and Compressed Neural Network

ISCA 2023

EIE accelerates pruned and compressed neural networks in a dedicated hardware accelerator, exploiting weight sparsity, activation sparsity, and 4-bit weight sharing.
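To make the idea concrete, here is a software sketch of the computation EIE performs in hardware (illustrative only, not the accelerator design): weights are stored column-wise in a compressed sparse format as 4-bit codebook indices, and zero activations are skipped entirely.

```python
# Minimal sketch of EIE-style sparse, weight-shared inference.
import numpy as np


def eie_spmv(codebook, col_ptr, row_idx, code_idx, x, out_dim):
    """y = W @ x with W stored column-wise in sparse, weight-shared form.

    codebook: (16,) shared weight values (4-bit weight sharing)
    col_ptr:  (in_dim + 1,) offsets into row_idx / code_idx per input column
    row_idx:  row index of each stored nonzero weight
    code_idx: 4-bit codebook index of each stored nonzero weight
    x:        (in_dim,) input activations
    """
    y = np.zeros(out_dim)
    for j, a in enumerate(x):
        if a == 0.0:  # activation sparsity: skip zero inputs
            continue
        # weight sparsity: only nonzero weights are stored and visited
        for k in range(col_ptr[j], col_ptr[j + 1]):
            y[row_idx[k]] += codebook[code_idx[k]] * a
    return y


# Tiny made-up example: a 3x4 weight matrix with 4 nonzeros.
codebook = np.linspace(-1, 1, 16)
col_ptr = np.array([0, 1, 2, 3, 4])
row_idx = np.array([0, 2, 1, 0])
code_idx = np.array([3, 15, 7, 0])
x = np.array([1.0, 0.0, 2.0, 0.5])
print(eie_spmv(codebook, col_ptr, row_idx, code_idx, x, out_dim=3))
```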

BEVFusion: Multi-Task Multi-Sensor Fusion with Unified Bird's-Eye View Representation

ICRA 2023

BEVFusion unifies multi-modal features in the shared bird’s-eye view (BEV) representation space, which nicely preserves both geometric and semantic information. It establishes the new state of the art on nuScenes, achieving 1.3% higher mAP and NDS on 3D object detection and 13.6% higher mIoU on BEV map segmentation with 1.9x lower computation cost.
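The sketch below shows only the fusion step, under the assumption that camera and LiDAR features have already been lifted onto the same BEV grid; the camera-to-BEV view transform and the task-specific heads are omitted, and the channel sizes and grid resolution are illustrative.

```python
# Minimal sketch of BEV-space fusion by concatenation + convolution
# (illustrative only, not the full BEVFusion pipeline).
import torch
import torch.nn as nn


class BEVFuser(nn.Module):
    def __init__(self, cam_ch=80, lidar_ch=256, out_ch=256):
        super().__init__()
        self.fuse = nn.Sequential(
            nn.Conv2d(cam_ch + lidar_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, cam_bev, lidar_bev):
        # Both inputs: (B, C, H, W) feature maps on the shared BEV grid.
        return self.fuse(torch.cat([cam_bev, lidar_bev], dim=1))


# Usage on a 180x180 BEV grid.
fuser = BEVFuser()
cam = torch.rand(1, 80, 180, 180)
lidar = torch.rand(1, 256, 180, 180)
print(fuser(cam, lidar).shape)  # torch.Size([1, 256, 180, 180])
```

Because both modalities live on the same grid, downstream detection and segmentation heads can share this fused feature map, which is what enables the multi-task setup.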

QuEst: Graph Transformer for Quantum Circuit Reliability Estimation

ICCAD 2022 (oral)

We develop graph transformer models to predict the fidelity of quantum circuits on real quantum devices.
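As a rough illustration of the prediction task, the sketch below encodes each gate by its type and the qubit it acts on and regresses a fidelity score; a plain transformer over gate tokens stands in for the graph transformer, and the vocabulary sizes and model dimensions are made-up.

```python
# Minimal sketch of circuit-fidelity regression (illustrative only, not QuEst).
import torch
import torch.nn as nn


class CircuitFidelityModel(nn.Module):
    def __init__(self, num_gate_types=10, num_qubits=5, dim=64):
        super().__init__()
        self.gate_emb = nn.Embedding(num_gate_types, dim)
        self.qubit_emb = nn.Embedding(num_qubits, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(dim, 1)

    def forward(self, gate_types, qubits):
        # gate_types, qubits: (B, L) integer tensors describing the circuit.
        x = self.gate_emb(gate_types) + self.qubit_emb(qubits)
        x = self.encoder(x)                              # attention over gate nodes
        return torch.sigmoid(self.head(x.mean(dim=1)))   # fidelity in [0, 1]


# Usage: a batch with one 8-gate circuit on 5 qubits.
model = CircuitFidelityModel()
gates = torch.randint(0, 10, (1, 8))
qubits = torch.randint(0, 5, (1, 8))
print(model(gates, qubits))  # predicted fidelity
```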