Dana Angluin

Yale University

Simulating Hard Attention Using Soft Attention

Dec 13, 2024

Transformers as Transducers

Apr 02, 2024

Transformers as Recognizers of Formal Languages: A Survey on Expressivity

Nov 01, 2023

Masked Hard-Attention Transformers and Boolean RASP Recognize Exactly the Star-Free Languages

Oct 21, 2023

Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity

Apr 13, 2022

Regular omega-Languages with an Informative Right Congruence

Sep 10, 2018

Context-Free Transductions with Neural Stacks

Sep 08, 2018