
Jingqiao Zhang

GUIM -- General User and Item Embedding with Mixture of Representation in E-commerce
Jul 02, 2022

Improving Contrastive Learning of Sentence Embeddings with Case-Augmented Positives and Retrieved Negatives
Jun 06, 2022

SAS: Self-Augmented Strategy for Language Model Pre-training
Jun 14, 2021

Progressively Stacking 2.0: A Multi-stage Layerwise Training Method for BERT Training Speedup
Nov 27, 2020

CoRe: An Efficient Coarse-refined Training Framework for BERT
Nov 27, 2020