Mohammed Sabry

ATHAR: A High-Quality and Diverse Dataset for Classical Arabic to English Translation

Jul 29, 2024

Assessing the Portability of Parameter Matrices Trained by Parameter-Efficient Finetuning Methods

Jan 25, 2024

PEFT-Ref: A Modular Reference Architecture and Typology for Parameter-Efficient Finetuning Techniques

Apr 24, 2023

AfriVEC: Word Embedding Models for African Languages. Case Study of Fon and Nobiin

Mar 18, 2021

On the Reduction of Variance and Overestimation of Deep Q-Learning

Oct 14, 2019