Yash Sinha

Trust Regions Sell, But Who's Buying? Overlap Geometry as an Alternative Trust Region for Policy Optimization

Feb 06, 2026

The Compliance Paradox: Semantic-Instruction Decoupling in Automated Academic Code Evaluation

Jan 29, 2026

When Reject Turns into Accept: Quantifying the Vulnerability of LLM-Based Scientific Reviewers to Indirect Prompt Injection

Dec 15, 2025

How to Trick Your AI TA: A Systematic Study of Academic Jailbreaking in LLM Code Evaluation

Dec 11, 2025

OrgAccess: A Benchmark for Role Based Access Control in Organization Scale LLMs

May 25, 2025

UnStar: Unlearning with Self-Taught Anti-Sample Reasoning for LLMs

Oct 22, 2024

Multi-Modal Recommendation Unlearning

May 24, 2024

Distill to Delete: Unlearning in Graph Networks with Knowledge Distillation

Sep 28, 2023