Markus Götz

Beyond Backpropagation: Optimization with Multi-Tangent Forward Gradients
Oct 23, 2024

ReCycle: Fast and Efficient Long Time Series Forecasting with Residual Cyclic Transformers
May 06, 2024

AB-Training: A Communication-Efficient Approach for Distributed Low-Rank Learning
May 02, 2024

Harnessing Orthogonality to Train Low-Rank Neural Networks
Jan 16, 2024

Feed-Forward Optimization With Delayed Feedback for Neural Networks
Apr 26, 2023

Massively Parallel Genetic Optimization through Asynchronous Propagation of Populations
Jan 20, 2023

Precise Energy Consumption Measurements of Heterogeneous Artificial Intelligence Workloads
Dec 03, 2022

Learning Tree Structures from Leaves For Particle Decay Reconstruction
Sep 01, 2022

HyDe: The First Open-Source, Python-Based, GPU-Accelerated Hyperspectral Denoising Package
Apr 14, 2022

Accelerating Neural Network Training with Distributed Asynchronous and Selective Optimization (DASO)
Apr 15, 2021