Michal Yemini

Fast Distributed Optimization over Directed Graphs under Malicious Attacks using Trust

Jul 09, 2024

Privacy Preserving Semi-Decentralized Mean Estimation over Intermittently-Connected Networks

Jun 06, 2024

Clipped SGD Algorithms for Privacy Preserving Performative Prediction: Bias Amplification and Remedies

Apr 17, 2024

The Role of Confidence for Trust-based Resilient Consensus

Apr 11, 2024

How Physicality Enables Trust: A New Era of Trust-Centered Cyberphysical Systems

Nov 13, 2023

Exploiting Trust for Resilient Hypothesis Testing with Malicious Robots (evolved version)

Mar 07, 2023

Collaborative Mean Estimation over Intermittently Connected Networks with Peer-To-Peer Privacy

Feb 28, 2023

Resilient Distributed Optimization for Multi-Agent Cyberphysical Systems

Dec 05, 2022

Exploiting Trust for Resilient Hypothesis Testing with Malicious Robots

Sep 25, 2022

Multi-Armed Bandits with Self-Information Rewards

Sep 06, 2022