
Mugdha Pandya

Hostility Detection in UK Politics: A Dataset on Online Abuse Targeting MPs

Dec 05, 2024

Exploring the Influence of Label Aggregation on Minority Voices: Implications for Dataset Bias and Model Training

Dec 05, 2024

Dimensions of Online Conflict: Towards Modeling Agonism

Nov 06, 2023

Alexa, Google, Siri: What are Your Pronouns? Gender and Anthropomorphism in the Design and Perception of Conversational Assistants

Jun 04, 2021

Intrinsic Bias Metrics Do Not Correlate with Application Bias

Jan 02, 2021