Esha Sarkar

Scalable privacy-preserving cancer type prediction with homomorphic encryption

Apr 12, 2022

PiDAn: A Coherence Optimization Approach for Backdoor Attack Detection and Mitigation in Deep Neural Networks

Mar 26, 2022

TRAPDOOR: Repurposing backdoors to detect dataset bias in machine learning-based genomic analysis

Aug 14, 2021

Explainability Matters: Backdoor Attacks on Medical Imaging

Dec 30, 2020

FaceHack: Triggering backdoored facial recognition systems using facial characteristics

Jun 20, 2020

Watch your back: Backdoor Attacks in Deep Reinforcement Learning-based Autonomous Vehicle Control Systems

Mar 17, 2020