
Itamar Zimerman

CLIMP: Contrastive Language-Image Mamba Pretraining

Jan 11, 2026

Neural Brain Fields: A NeRF-Inspired Approach for Generating Nonexistent EEG Electrodes

Dec 20, 2025

Efficient Decoding Methods for Language Models on Encrypted Data

Sep 10, 2025

Differential Mamba

Jul 08, 2025

Overclocking LLM Reasoning: Monitoring and Controlling Thinking Path Lengths in LLMs

Jun 08, 2025

Overflow Prevention Enhances Long-Context Recurrent LLMs

May 12, 2025

On the Expressivity of Selective State-Space Layers: A Multivariate Polynomial Approach

Feb 04, 2025

Power-Softmax: Towards Secure LLM Inference over Encrypted Data

Oct 12, 2024

DeciMamba: Exploring the Length Extrapolation Potential of Mamba

Jun 20, 2024

A Unified Implicit Attention Formulation for Gated-Linear Recurrent Sequence Models

May 26, 2024