Boxun Xu

Towards 3D Acceleration for low-power Mixture-of-Experts and Multi-Head Attention Spiking Transformers

Dec 07, 2024

Trimming Down Large Spiking Vision Transformers via Heterogeneous Quantization Search

Dec 07, 2024

Spiking Transformer Hardware Accelerators in 3D Integration

Nov 11, 2024

ADO-LLM: Analog Design Bayesian Optimization with In-Context Learning of Large Language Models

Jun 26, 2024

DISTA: Denoising Spiking Transformer with intrinsic plasticity and spatiotemporal attention

Nov 15, 2023

UPAR: A Kantian-Inspired Prompting Framework for Enhancing Large Language Model Capabilities

Sep 30, 2023