Boyang Zhang

HeRo-Q: A General Framework for Stable Low Bit Quantization via Hessian Conditioning

Jan 29, 2026

MoE-DisCo: Low Economy Cost Training Mixture-of-Experts Models

Jan 11, 2026

FLEX-MoE: Federated Mixture-of-Experts with Load-balanced Expert Assignment

Dec 28, 2025

Rethinking Parameter Sharing as Graph Coloring for Structured Compression

Nov 10, 2025

A Vision-Based Collision Sensing Method for Stable Circular Object Grasping with a Soft Gripper System

Aug 07, 2025

SP2RINT: Spatially-Decoupled Physics-Inspired Progressive Inverse Optimization for Scalable, PDE-Constrained Meta-Optical Neural Network Training

May 23, 2025

Can the Capability of Large Language Models be Described by Human Ability? A Meta-Study

Apr 13, 2025

A General Error-Theoretical Analysis Framework for Constructing Compression Strategies

Feb 19, 2025

FP=xINT: A Low-Bit Series Expansion Algorithm for Post-Training Quantization

Dec 09, 2024

Compression for Better: A General and Stable Lossless Compression Framework

Dec 09, 2024