Zhengqi Gao

PIC2O-Sim: A Physics-Inspired Causality-Aware Dynamic Convolutional Neural Operator for Ultra-Fast Photonic Device FDTD Simulation

Jun 24, 2024

On the Theory of Cross-Modality Distillation with Contrastive Learning

May 06, 2024

KirchhoffNet: A Circuit Bridging Message Passing and Continuous-Depth Models

Oct 24, 2023

Nominality Score Conditioned Time Series Anomaly Detection by Point/Sequential Reconstruction

Oct 24, 2023

NeurOLight: A Physics-Agnostic Neural Operator Enabling Parametric Photonic Device Simulation

Sep 19, 2022

Learning from Multiple Annotator Noisy Labels via Sample-wise Label Fusion

Jul 22, 2022

A Simple Data Mixing Prior for Improving Self-Supervised Learning

Jun 15, 2022

The Modality Focusing Hypothesis: On the Blink of Multimodal Knowledge Distillation

Jun 13, 2022

Training-Free Robust Multimodal Learning via Sample-Wise Jacobian Regularization

Apr 05, 2022

Co-advise: Cross Inductive Bias Distillation

Jun 23, 2021