

11/22/2023 | Intelligent Embedded Systems

New workshop contributions at NeurIPS 2023

This year, two contributions by IES authors were accepted to workshops at the NeurIPS (Neural Information Processing Systems) conference:

  • Veronica Lachi, Alice Moallemy-Oureh, Andreas Roth and Pascal Welke presented a workshop contribution entitled Graph Pooling Provably Improves Expressivity at the NeurIPS 2023 Workshop New Frontiers in Graph Learning. The abstract:
    In the domain of graph neural networks (GNNs), pooling operators are fundamental for reducing graph size by simplifying graph structures and vertex features. Recent advances have shown that well-designed pooling operators, coupled with message-passing layers, can give hierarchical GNNs an expressive power, with respect to the graph isomorphism problem, equal to that of the Weisfeiler-Leman (WL) test. However, the ability of hierarchical GNNs to increase expressive power through graph coarsening has not yet been explored. This leaves uncertainty about the benefits of pooling operators and a lack of sufficient properties to guide their design. In this work, we identify conditions under which pooling operators generate WL-distinguishable coarsened graphs from originally WL-indistinguishable but non-isomorphic graphs. Our conditions are versatile and can be tailored to specific tasks and data characteristics, offering a promising avenue for further research.
  • Alice Moallemy-Oureh, Silvia Beddar-Wiesing, Rüdiger Nather and Josephine Thomas presented their contribution Marked Neural Spatio-Temporal Point Process Involving a Dynamic Graph Neural Network at the Temporal Graph Learning Workshop. The abstract:
    Spatio-Temporal Point Processes (STPPs) have recently become increasingly interesting for learning on dynamic graph data, since many scientific fields, ranging from mathematics, biology, the social sciences, and physics to computer science, naturally involve related, dynamic entities. While training Recurrent Neural Networks and solving PDEs to represent temporal data is expensive, TPPs offer a good alternative. The drawback is that constructing an appropriate TPP for modeling temporal data requires assuming a particular temporal behavior of the data. To overcome this problem, Neural TPPs have been developed, which learn the parameters of the TPP from data. However, research on modeling dynamic graphs is still relatively young, and only a few TPPs have been proposed that handle edge-dynamic graphs. To allow learning on a fully dynamic graph, we propose the first Marked Neural Spatio-Temporal Point Process (MNSTPP), which leverages a Dynamic Graph Neural Network to learn Spatio-TPPs that model and predict any event in a graph stream. In addition, our model can be updated efficiently by considering single events for local retraining.
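The graph coarsening step that pooling operators perform, as described in the first contribution, can be sketched in plain Python. This is a minimal, illustrative example (sum pooling over a given cluster assignment), not the authors' method; the toy graph, cluster labels, and function names are assumptions for demonstration only.

```python
# Hedged sketch: one graph pooling (coarsening) step in plain Python.
# Nodes are merged per cluster: features are sum-pooled onto super-nodes,
# intra-cluster edges vanish, inter-cluster edges collapse onto clusters.

def coarsen(edges, features, assignment):
    """Pool a graph: merge each cluster into a single super-node.

    edges:      list of (u, v) pairs (undirected)
    features:   dict node -> scalar feature
    assignment: dict node -> cluster id
    Returns (coarse_edges, coarse_features).
    """
    # Sum-pool vertex features within each cluster.
    coarse_feats = {}
    for node, feat in features.items():
        c = assignment[node]
        coarse_feats[c] = coarse_feats.get(c, 0) + feat

    # Collapse edges onto clusters; drop edges inside a cluster.
    coarse_edges = set()
    for u, v in edges:
        cu, cv = assignment[u], assignment[v]
        if cu != cv:
            coarse_edges.add((min(cu, cv), max(cu, cv)))
    return sorted(coarse_edges), coarse_feats

# Toy example: a 4-cycle pooled into two super-nodes A and B.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
features = {0: 1.0, 1: 2.0, 2: 3.0, 3: 4.0}
assignment = {0: "A", 1: "A", 2: "B", 3: "B"}
print(coarsen(edges, features, assignment))
# → ([('A', 'B')], {'A': 3.0, 'B': 7.0})
```

The paper's question is which properties such an operator must have so that two non-isomorphic graphs the WL test cannot tell apart become distinguishable after coarsening; the sketch only shows the mechanics of the pooling step itself.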
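The second contribution builds on marked temporal point processes. As a hedged illustration of the underlying concept only (not of the MNSTPP model, which learns its intensity with a dynamic GNN), the sketch below simulates a marked TPP with a fixed, Hawkes-style self-exciting intensity using Ogata's thinning algorithm; all parameter values, mark sets, and names are illustrative assumptions.

```python
# Hedged sketch: simulating a marked temporal point process via Ogata's
# thinning algorithm with an exponentially decaying (Hawkes-like) intensity.
# In a Neural TPP this hand-picked intensity would be replaced by a learned one.
import math
import random

def simulate_marked_tpp(mu, alpha, beta, t_max, marks, seed=0):
    """Return a list of (time, mark) events on [0, t_max].

    mu:    baseline intensity
    alpha: jump added to the intensity by each past event
    beta:  exponential decay rate of that excitation
    marks: candidate marks, drawn uniformly here for simplicity
    """
    rng = random.Random(seed)
    events = []

    def intensity(t):
        # Baseline plus decaying excitation from all past events.
        return mu + sum(alpha * math.exp(-beta * (t - s)) for s, _ in events)

    t = 0.0
    while t < t_max:
        # Between events the intensity only decays, so the current value
        # (plus one jump, to be safe) upper-bounds it until the next event.
        bound = intensity(t) + alpha
        t += rng.expovariate(bound)
        if t >= t_max:
            break
        # Thinning: accept the proposal with probability lambda(t) / bound.
        if rng.random() <= intensity(t) / bound:
            events.append((t, rng.choice(marks)))
    return events

stream = simulate_marked_tpp(mu=0.5, alpha=0.8, beta=1.0,
                             t_max=10.0, marks=["add_edge", "del_edge"], seed=1)
print(stream)
```

In the graph-stream setting of the paper, each event would be a timestamped change to the graph (here mocked up as "add_edge"/"del_edge" marks), and the model predicts when the next event occurs and what kind it is.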