Geraldo F. Oliveira

TransPimLib: A Library for Efficient Transcendental Functions on Processing-in-Memory Systems
Apr 23, 2023

Accelerating Neural Network Inference with Processing-in-DRAM: From the Edge to the Cloud
Sep 19, 2022

An Experimental Evaluation of Machine Learning Training on a Real Processing-in-Memory System
Jul 16, 2022

Machine Learning Training on a Real Processing-in-Memory System
Jun 13, 2022

Heterogeneous Data-Centric Architectures for Modern Data-Intensive Applications: Case Studies in Machine Learning and Databases
May 29, 2022

Google Neural Network Models for Edge Devices: Analyzing and Mitigating Machine Learning Inference Bottlenecks
Sep 29, 2021

Mitigating Edge Machine Learning Inference Bottlenecks: An Empirical Study on Accelerating Google Edge Models
Mar 01, 2021