New journal article in Transactions on Machine Learning Research
Pascal Plettenberg, Dominik Köhler, Bernhard Sick and Josephine M. Thomas present a new article, "Flow-Attentional Graph Neural Networks," in Transactions on Machine Learning Research (2025).
Abstract: Graph Neural Networks (GNNs) have become essential for learning from graph-structured data. However, existing GNNs do not consider the conservation law inherent in graphs associated with a flow of physical resources, such as electrical current in power grids or traffic in transportation networks, which can lead to reduced model performance. To address this, we propose flow attention, which adapts existing graph attention mechanisms to satisfy Kirchhoff's first law. Furthermore, we discuss how this modification influences the expressivity and identify sets of non-isomorphic graphs that can be discriminated by flow attention but not by standard attention. Through extensive experiments on two flow graph datasets (electronic circuits and power grids), we demonstrate that flow attention enhances the performance of attention-based GNNs on both graph-level classification and regression tasks.
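For readers curious what "flow attention" might look like in practice, below is a minimal sketch of one plausible reading of the abstract, not the authors' implementation. Standard graph attention (as in GAT) softmax-normalizes coefficients over each target node's incoming edges, so a source node's message can be copied in full to many targets. A flow-conserving variant can instead normalize over each source node's outgoing edges, so the message is split across successors, echoing Kirchhoff's first law. The class and helper names (FlowAttentionLayer, softmax_by_group) and the exact mechanism are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def softmax_by_group(logits: torch.Tensor, group: torch.Tensor, num_groups: int) -> torch.Tensor:
    """Numerically stable softmax within groups defined by `group` (per-node here)."""
    max_per_group = torch.full((num_groups,), float("-inf"), dtype=logits.dtype)
    max_per_group = max_per_group.scatter_reduce(0, group, logits, reduce="amax")
    exp = (logits - max_per_group[group]).exp()
    denom = torch.zeros(num_groups, dtype=logits.dtype).index_add_(0, group, exp)
    return exp / denom[group].clamp_min(1e-12)


class FlowAttentionLayer(nn.Module):
    """Sketch of a GAT-style layer whose attention weights sum to one over each
    node's OUTGOING edges, so a node's message is split across its successors
    (a Kirchhoff-style conservation) rather than duplicated."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)
        self.att = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, in_dim]; edge_index: [2, num_edges], rows = (src, dst).
        src, dst = edge_index
        h = self.lin(x)
        # Per-edge attention logit, computed as in standard GAT.
        logits = F.leaky_relu(self.att(torch.cat([h[src], h[dst]], dim=-1)).squeeze(-1), 0.2)
        # Standard attention would softmax over each dst node's incoming edges;
        # this variant normalizes over each src node's outgoing edges instead.
        alpha = softmax_by_group(logits, src, x.size(0))
        # Sum the conserved edge messages at the target nodes.
        out = torch.zeros_like(h)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out


# Toy usage: a 3-node directed graph with edges 0->1, 0->2, 1->2.
if __name__ == "__main__":
    x = torch.randn(3, 4)
    edge_index = torch.tensor([[0, 0, 1], [1, 2, 2]])
    layer = FlowAttentionLayer(4, 8)
    print(layer(x, edge_index).shape)  # torch.Size([3, 8])
```

Note the design consequence of source-side normalization: the total "mass" leaving any node with outgoing edges is exactly its transformed feature vector, which is the conservation property the abstract attributes to flows of physical resources. How the paper handles multi-head attention, edge features, or nodes without outgoing edges is not stated in the abstract; consult the article for the actual formulation.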