Reto Gubelmann

University of St.Gallen

Sentence Smith: Formally Controllable Text Transformation and its Application to Evaluation of Text Embedding Models

Feb 20, 2025

Uncovering More Shallow Heuristics: Probing the Natural Language Inference Capacities of Transformer-Based Pre-Trained Language Models Using Syllogistic Patterns

Jan 19, 2022

Exploring the Promises of Transformer-Based LMs for the Representation of Normative Claims in the Legal Domain

Aug 25, 2021