
Brucek Khailany

SQ-DM: Accelerating Diffusion Models with Aggressive Quantization and Temporal Sparsity

Jan 26, 2025

ESPACE: Dimensionality Reduction of Activations for Model Compression

Oct 07, 2024

Revisiting VerilogEval: Newer LLMs, In-Context Learning, and Specification-to-RTL Tasks

Aug 20, 2024

VerilogCoder: Autonomous Verilog Coding Agents with Graph-based Planning and Abstract Syntax Tree (AST)-based Waveform Tracing Tool

Aug 15, 2024

ChipNeMo: Domain-Adapted LLMs for Chip Design

Nov 13, 2023

VerilogEval: Evaluating Large Language Models for Verilog Code Generation

Sep 14, 2023

HEAT: Hardware-Efficient Automatic Tensor Decomposition for Transformer Compression

Nov 30, 2022

An Adversarial Active Sampling-based Data Augmentation Framework for Manufacturable Chip Design

Oct 27, 2022

Large Scale Mask Optimization Via Convolutional Fourier Neural Operator and Litho-Guided Self Training

Jul 08, 2022

Optimal Clipping and Magnitude-aware Differentiation for Improved Quantization-aware Training

Jun 13, 2022