Publications by Shiyang Chen

PrisonBreak: Jailbreaking Large Language Models with Fewer Than Twenty-Five Targeted Bit-flips (Dec 10, 2024)

FP6-LLM: Efficiently Serving Large Language Models Through FP6-Centric Algorithm-System Co-Design (Jan 25, 2024)

ZeroQuant(4+2): Redefining LLMs Quantization with a New FP6-Centric Strategy for Diverse Generative Tasks (Dec 18, 2023)

DeepSpeed4Science Initiative: Enabling Large-Scale Scientific Discovery through Sophisticated AI System Technologies (Oct 11, 2023)

Tango: rethinking quantization for graph neural network training on GPUs (Aug 02, 2023)

Motif-based Graph Representation Learning with Application to Chemical Molecules (Aug 09, 2022)

A Length Adaptive Algorithm-Hardware Co-design of Transformer on FPGA Through Sparse Attention and Dynamic Pipelining (Aug 07, 2022)

Transfer learning of phase transitions in percolation and directed percolation (Jan 06, 2022)

Sparse Progressive Distillation: Resolving Overfitting under Pretrain-and-Finetune Paradigm (Oct 18, 2021)