Abstract: Complex networks describe important structures in nature and society, composed of nodes and the edges that connect them. The evolution of these networks is typically described by dynamical equations, which are labor-intensive to derive and require expert knowledge. However, because complex-network data consist of noisy observations from multiple node trajectories, existing symbolic regression methods are either not applicable to or ineffective on such dynamics. In this paper, we propose Physically Inspired Neural Dynamics Symbolic Regression (PI-NDSR), a method based on neural networks and genetic programming that automatically learns the symbolic expression of the dynamics. Our method consists of two key components: a Physically Inspired Neural Dynamics (PIND) module that augments and denoises trajectories by interpolating the observed trajectories; and a coordinated genetic search algorithm that derives symbolic expressions. This algorithm leverages references for node and edge dynamics provided by the neural dynamics to avoid overfitted expressions in the symbolic space. We evaluate our method on synthetic datasets generated by various dynamics and on real datasets of disease spreading. The results demonstrate that PI-NDSR outperforms existing methods in terms of both recovery probability and error.
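The abstract above hinges on a neural surrogate that separates node self-dynamics from edge (coupling) dynamics before the genetic search distills them into symbols. Below is a minimal sketch of such a surrogate, assuming the common decomposition dx_i/dt = f(x_i) + sum_j A_ij g(x_i, x_j); the class name, architecture, and shapes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn


class PhysicallyInspiredNeuralDynamics(nn.Module):
    """Hypothetical surrogate with separate node-dynamics and edge-dynamics terms."""

    def __init__(self, state_dim: int, hidden_dim: int = 64):
        super().__init__()
        # f: self-dynamics of a single node
        self.node_net = nn.Sequential(
            nn.Linear(state_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, state_dim),
        )
        # g: pairwise interaction between a node and one neighbor
        self.edge_net = nn.Sequential(
            nn.Linear(2 * state_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, state_dim),
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, state_dim); adj: (num_nodes, num_nodes)
        n = x.shape[0]
        xi = x.unsqueeze(1).expand(n, n, -1)                 # receiver states
        xj = x.unsqueeze(0).expand(n, n, -1)                 # neighbor states
        pairwise = self.edge_net(torch.cat([xi, xj], dim=-1))
        coupling = (adj.unsqueeze(-1) * pairwise).sum(dim=1)  # sum over neighbors
        return self.node_net(x) + coupling                    # dx/dt for every node


# Usage: once fitted to noisy observed trajectories, node_net and edge_net can be
# queried densely to produce denoised reference trajectories for the symbolic search.
model = PhysicallyInspiredNeuralDynamics(state_dim=1)
x0 = torch.rand(5, 1)                       # 5 nodes, scalar state
adj = (torch.rand(5, 5) > 0.5).float()      # random interaction graph
dxdt = model(x0, adj)                       # instantaneous derivatives, shape (5, 1)
```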
Abstract: The message passing mechanism contributes to the success of GNNs in various applications, but it also brings the oversquashing problem. Recent works combat oversquashing by improving the graph spectrum with rewiring techniques, but these disrupt the structural bias in graphs and yield only limited improvement as measured by oversquashing metrics. Motivated by unitary RNNs, we propose Graph Unitary Message Passing (GUMP), which alleviates oversquashing in GNNs by applying a unitary adjacency matrix for message passing. To design GUMP, we first propose a transformation that equips general graphs with a unitary adjacency matrix while preserving their structural bias. The unitary adjacency matrix is then obtained with a unitary projection algorithm, which exploits the intrinsic structure of the unitary adjacency matrix and allows GUMP to be permutation-equivariant. Experimental results show the effectiveness of GUMP in improving performance on various graph learning tasks.
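To make the core idea concrete, here is a small sketch of message passing with a unitarized propagation operator. The SVD-based projection (nearest orthogonal matrix in Frobenius norm) is a generic stand-in of my own choosing; the paper's actual projection algorithm exploits the intrinsic structure of the transformed graph and guarantees permutation equivariance, which this toy version does not claim to do.

```python
import torch
import torch.nn as nn


def project_to_unitary(a: torch.Tensor) -> torch.Tensor:
    """Replace A by the closest orthogonal matrix U V^T, using the SVD A = U S V^T."""
    u, _, vh = torch.linalg.svd(a)
    return u @ vh


class UnitaryMessagePassingLayer(nn.Module):
    """One layer that propagates messages with a norm-preserving adjacency operator."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, in_dim); adj: (num_nodes, num_nodes) adjacency
        a_unitary = project_to_unitary(adj)       # unitary operators preserve signal norm
        return torch.tanh(self.lin(a_unitary @ x))


x = torch.randn(6, 8)
adj = (torch.rand(6, 6) > 0.5).float()
layer = UnitaryMessagePassingLayer(8, 16)
out = layer(x, adj)   # messages propagate without the exponential decay behind oversquashing
```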
Abstract: Graph Neural Networks (GNNs) have recently been introduced to learn from knowledge graphs (KGs) and have achieved state-of-the-art performance in KG reasoning. However, a theoretical certification of their good empirical performance is still absent. Moreover, while logic in KGs is important for inductive and interpretable inference, existing GNN-based methods are designed merely to fit data distributions, with little known about their logical expressiveness. In this paper, we propose to fill this gap. Specifically, we theoretically analyze GNNs in terms of logical expressiveness and identify what kinds of logical rules they can capture from KGs. Our results first show that GNNs can capture logical rules from graded modal logic, providing a new theoretical tool for analyzing the expressiveness of GNNs for KG reasoning; they further show that a query labeling trick makes it easier for GNNs to capture logical rules, explaining why state-of-the-art methods are mainly based on the labeling trick. Finally, insights from our theory motivate the development of an entity labeling method for capturing difficult logical rules. Experimental results are consistent with our theoretical findings and verify the effectiveness of the proposed method.
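For readers unfamiliar with labeling tricks, the sketch below shows one common form of query labeling: each entity's feature vector is extended with an indicator flagging the query's head entity before the GNN runs. The function name and shapes are illustrative assumptions; the paper's labeling schemes may differ in detail.

```python
import torch


def add_query_label(node_feats: torch.Tensor, query_head: int) -> torch.Tensor:
    """Append a binary 'is the query head entity' flag to every entity embedding."""
    num_nodes = node_feats.shape[0]
    label = torch.zeros(num_nodes, 1)
    label[query_head] = 1.0
    return torch.cat([node_feats, label], dim=-1)


feats = torch.randn(7, 32)             # 7 entities with 32-d features
labeled = add_query_label(feats, 2)    # entity 2 is the query head; result is (7, 33)
# The labeled features are then fed to the GNN, which can learn rules
# expressed relative to the queried entity rather than only global patterns.
```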