Nandan Kumar Jha

TruncFormer: Private LLM Inference Using Only Truncations

Dec 02, 2024

AERO: Softmax-Only LLMs for Efficient Private Inference

Oct 16, 2024

ReLU's Revival: On the Entropic Overload in Normalization-Free Large Language Models

Oct 12, 2024

Sisyphus: A Cautionary Tale of Using Low-Degree Polynomial Activations in Privacy-Preserving Deep Learning

Jul 26, 2021

Circa: Stochastic ReLUs for Private Deep Learning

Jun 15, 2021
Figure 1 for Circa: Stochastic ReLUs for Private Deep Learning
Figure 2 for Circa: Stochastic ReLUs for Private Deep Learning
Figure 3 for Circa: Stochastic ReLUs for Private Deep Learning
Figure 4 for Circa: Stochastic ReLUs for Private Deep Learning
Viaarxiv icon

DeepReDuce: ReLU Reduction for Fast Private Inference

Mar 02, 2021

Modeling Data Reuse in Deep Neural Networks by Taking Data-Types into Cognizance

Aug 06, 2020

DeepPeep: Exploiting Design Ramifications to Decipher the Architecture of Compact DNNs

Jul 30, 2020

On the Demystification of Knowledge Distillation: A Residual Network Perspective

Jun 30, 2020

DRACO: Co-Optimizing Hardware Utilization, and Performance of DNNs on Systolic Accelerator

Jun 26, 2020