
Xindian Ma

CrossQuant: A Post-Training Quantization Method with Smaller Quantization Kernel for Precise Large Language Model Compression

Oct 10, 2024

3D-RPE: Enhancing Long-Context Modeling Through 3D Rotary Position Encoding

Jun 14, 2024

TensorCoder: Dimension-Wise Attention via Tensor Representation for Natural Language Modeling

Aug 12, 2020

A Tensorized Transformer for Language Modeling

Aug 09, 2019

A Generalized Language Model in Tensor Space

Jan 31, 2019