Reto Gubelmann

University of St.Gallen

Uncovering More Shallow Heuristics: Probing the Natural Language Inference Capacities of Transformer-Based Pre-Trained Language Models Using Syllogistic Patterns

Jan 19, 2022

Exploring the Promises of Transformer-Based LMs for the Representation of Normative Claims in the Legal Domain

Aug 25, 2021