
Panos Kalnis

KAUST

RED: Effective Trajectory Representation Learning with Comprehensive Information

Nov 22, 2024

Task-Oriented GNNs Training on Large Knowledge Graphs for Accurate and Efficient Modeling

Mar 09, 2024

A Universal Question-Answering Platform for Knowledge Graphs

Mar 01, 2023

ChatGPT versus Traditional Question Answering for Knowledge Graphs: Current Status and Future Directions Towards Knowledge Graph Chatbots

Feb 08, 2023

Rethinking gradient sparsification as total error minimization

Aug 02, 2021

DeepReduce: A Sparse-tensor Communication Framework for Distributed Deep Learning

Feb 05, 2021

On the Discrepancy between the Theoretical Analysis and Practical Implementations of Compressed Communication for Distributed Deep Learning

Nov 19, 2019

Scaling Distributed Machine Learning with In-Network Aggregation

Feb 22, 2019