
Shuhui Qu


Adaptive Test-Time Compute Allocation via Learned Heuristics over Categorical Structure
Feb 03, 2026

Active Epistemic Control for Query-Efficient Verified Planning
Feb 03, 2026

Fuzzy Categorical Planning: Autonomous Goal Satisfaction with Graded Semantic Constraints
Jan 27, 2026

Teaching LLMs to Ask: Self-Querying Category-Theoretic Planning for Under-Specified Reasoning
Jan 27, 2026

GoMS: Graph of Molecule Substructure Network for Molecule Property Prediction
Dec 13, 2025

Patch-aware Vector Quantized Codebook Learning for Unsupervised Visual Defect Detection
Jan 15, 2025

Efficient Generation of Molecular Clusters with Dual-Scale Equivariant Flow Matching
Oct 10, 2024

SHAPNN: Shapley Value Regularized Tabular Neural Network
Sep 15, 2023

Error-aware Quantization through Noise Tempering
Dec 11, 2022

SQuAT: Sharpness- and Quantization-Aware Training for BERT
Oct 13, 2022