Yani Ioannou

Department of Electrical and Software Engineering, University of Calgary, Calgary, Canada

Learning Parameter Sharing with Tensor Decompositions and Sparsity

Nov 14, 2024

Navigating Extremes: Dynamic Sparsity in Large Output Space

Nov 05, 2024

What is Left After Distillation? How Knowledge Transfer Impacts Fairness and Bias

Oct 10, 2024

Trustworthy and Responsible AI for Human-Centric Autonomous Decision-Making Systems

Sep 02, 2024

Meta-GCN: A Dynamically Weighted Loss Minimization Method for Dealing with the Data Imbalance in Graph Neural Networks

Jun 24, 2024

Dynamic Sparse Training with Structured Sparsity

May 03, 2023

Bounding generalization error with input compression: An empirical study with infinite-width networks

Jul 19, 2022

Monitoring Shortcut Learning using Mutual Information

Jun 27, 2022

Measuring Neural Net Robustness with Constraints

Jun 16, 2017

Deep Roots: Improving CNN Efficiency with Hierarchical Filter Groups

Nov 30, 2016