Wanyun Cui

Cherry on Top: Parameter Heterogeneity and Quantization in Large Language Models

Apr 03, 2024

Who Said That? Benchmarking Social Media AI Detection

Oct 12, 2023

Ada-Instruct: Adapting Instruction Generators for Complex Reasoning

Oct 10, 2023

Evade ChatGPT Detectors via A Single Space

Jul 05, 2023

Free Lunch for Efficient Textual Commonsense Integration in Language Models

May 24, 2023

Exploring Automatically Perturbed Natural Language Explanations in Relation Extraction

May 24, 2023

Instance-based Learning for Knowledge Base Completion

Nov 13, 2022

Open Rule Induction

Oct 26, 2021

Enhancing Language Models with Plug-and-Play Large-Scale Commonsense

Sep 19, 2021

Isotonic Data Augmentation for Knowledge Distillation

Jul 06, 2021