Academic literature on the topic 'Weisfeiler-Lehman graph isomorphism'


Journal articles on the topic "Weisfeiler-Lehman graph isomorphism"

1. Schulz, Till, Pascal Welke, and Stefan Wrobel. "Graph Filtration Kernels." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 8 (June 28, 2022): 8196–203. http://dx.doi.org/10.1609/aaai.v36i8.20793.

Abstract: Most popular graph kernels are based on Haussler's R-convolution kernel and define graph similarities in terms of mutual substructures. In this work, we enrich these similarity measures by considering graph filtrations: using meaningful orders on the set of edges, which allow the construction of a sequence of nested graphs, we can consider a graph at multiple granularities. A key concept of our approach is to track graph features over the course of such graph resolutions. Rather than simply comparing frequencies of features in graphs, this allows for their comparison in terms of when and for how long they exist in the sequences. We propose a family of graph kernels that incorporate these existence intervals of features. While our approach can be applied to arbitrary graph features, we particularly highlight Weisfeiler-Lehman vertex labels, leading to efficient kernels. We show that using Weisfeiler-Lehman labels over certain filtrations strictly increases the expressive power over the ordinary Weisfeiler-Lehman procedure in terms of deciding graph isomorphism. This result directly yields more powerful graph kernels based on such features and has implications for graph neural networks due to their close relationship to the Weisfeiler-Lehman method. We empirically validate the expressive power of our graph kernels and show significant improvements over state-of-the-art graph kernels in predictive performance on various real-world benchmark datasets.
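The 1-WL colour refinement underlying these kernels can be sketched in a few lines. The sketch below is illustrative only (plain Python; the function name, adjacency-dict representation, and round count are assumptions, not taken from the paper): each round recolours every node by its own colour together with the sorted multiset of its neighbours' colours, then compresses these signatures back to small integers.

```python
from collections import Counter

def wl_refinement(adj, rounds=3):
    """1-dimensional Weisfeiler-Lehman colour refinement (sketch).

    adj: dict mapping each node to a list of its neighbours.
    Returns the histogram (Counter) of final node colours.
    """
    # Start with the same colour on every node.
    colours = {v: 0 for v in adj}
    for _ in range(rounds):
        # Signature = (own colour, sorted multiset of neighbour colours).
        signatures = {
            v: (colours[v], tuple(sorted(colours[u] for u in adj[v])))
            for v in adj
        }
        # Compress distinct signatures to fresh small integer colours.
        palette = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        colours = {v: palette[signatures[v]] for v in adj}
    return Counter(colours.values())

# Two example graphs: a triangle and a 3-node path.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
```

Differing histograms certify non-isomorphism, but equal histograms are inconclusive: a 6-cycle and two disjoint triangles receive identical colourings under this procedure, which is the kind of limitation the filtration-based kernels above aim to mitigate.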
2. Yang, Sihong, Dezhi Jin, Jun Liu, and Ye He. "Identification of Young High-Functioning Autism Individuals Based on Functional Connectome Using Graph Isomorphism Network: A Pilot Study." Brain Sciences 12, no. 7 (July 5, 2022): 883. http://dx.doi.org/10.3390/brainsci12070883.

Abstract: Accumulating studies have characterized changes in functional connectivity in autism spectrum disorder (ASD) and spurred the application of machine learning for classifying ASD. Graph neural networks provide a new method of network analysis in brain disorders to identify the underlying network features associated with functional deficits. Here, we propose an improved Graph Isomorphism Network (GIN) model that implements the Weisfeiler-Lehman (WL) graph isomorphism test to learn graph features while taking into account the importance of each node in the classification, improving the interpretability of the algorithm. We applied the proposed method to multisite resting-state functional connectome datasets from the Autism Brain Imaging Data Exchange (ABIDE) after stringent quality control. The proposed method outperformed other commonly used classification methods on five different evaluation metrics. We also identified salient ROIs in visual and frontoparietal control networks, which could provide potential neuroimaging biomarkers for ASD identification.
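The reason GIN-style models match the discriminative power of the WL test is their injective sum aggregation. A toy comparison (hypothetical helper functions, not from the paper) shows how mean aggregation collapses distinct neighbourhood multisets that sum aggregation keeps apart:

```python
def mean_update(own, neighbour_colours):
    # Mean aggregation: only the average neighbour colour survives,
    # so multisets of different sizes can become indistinguishable.
    return (own, sum(neighbour_colours) / len(neighbour_colours))

def sum_update(own, neighbour_colours):
    # Sum aggregation (as in GIN): an injective readout of the
    # neighbour-colour multiset, matching the WL update.
    return (own, sum(neighbour_colours))

# One neighbour of colour 1 versus two neighbours of colour 1.
single, double = [1], [1, 1]
```

Under mean aggregation a node with one colour-1 neighbour looks identical to a node with two, whereas the sum separates the two cases, which is exactly the distinction the WL test relies on.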
3. Feng, Aosong, Chenyu You, Shiqiang Wang, and Leandros Tassiulas. "KerGNNs: Interpretable Graph Neural Networks with Graph Kernels." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 6 (June 28, 2022): 6614–22. http://dx.doi.org/10.1609/aaai.v36i6.20615.

Abstract: Graph kernels are historically the most widely used technique for graph classification tasks. However, these methods suffer from limited performance because of the hand-crafted combinatorial features of graphs. In recent years, graph neural networks (GNNs) have become the state-of-the-art method in downstream graph-related tasks due to their superior performance. Most GNNs are based on message passing neural network (MPNN) frameworks. However, recent studies show that MPNNs cannot exceed the power of the Weisfeiler-Lehman (WL) algorithm in the graph isomorphism test. To address the limitations of existing graph kernel and GNN methods, we propose a novel GNN framework, termed Kernel Graph Neural Networks (KerGNNs), which integrates graph kernels into the message passing process of GNNs. Inspired by convolution filters in convolutional neural networks (CNNs), KerGNNs adopt trainable hidden graphs as graph filters, which are combined with subgraphs to update node embeddings using graph kernels. In addition, we show that MPNNs can be viewed as special cases of KerGNNs. We apply KerGNNs to multiple graph-related tasks and use cross-validation to make fair comparisons with benchmarks. We show that our method achieves competitive performance compared with existing state-of-the-art methods, demonstrating the potential to increase the representation ability of GNNs. We also show that the trained graph filters in KerGNNs can reveal the local graph structures of the dataset, which significantly improves the model interpretability compared with conventional GNN models.
4. Wu, Jun, Jingrui He, and Elizabeth Ainsworth. "Non-IID Transfer Learning on Graphs." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 10342–50. http://dx.doi.org/10.1609/aaai.v37i9.26231.

Abstract: Transfer learning refers to the transfer of knowledge or information from a relevant source domain to a target domain. However, most existing transfer learning theories and algorithms focus on IID tasks, where the source/target samples are assumed to be independent and identically distributed. Very little effort is devoted to theoretically studying knowledge transferability on non-IID tasks, e.g., cross-network mining. To bridge the gap, we propose rigorous generalization bounds and algorithms for cross-network transfer learning from a source graph to a target graph. The crucial idea is to characterize cross-network knowledge transferability from the perspective of the Weisfeiler-Lehman graph isomorphism test. To this end, we propose a novel Graph Subtree Discrepancy to measure the graph distribution shift between source and target graphs. Generalization error bounds on cross-network transfer learning, including both cross-network node classification and link prediction tasks, can then be derived in terms of the source knowledge and the Graph Subtree Discrepancy across domains. This motivates us to propose a generic graph adaptive network (GRADE) to minimize the distribution shift between source and target graphs for cross-network transfer learning. Experimental results verify the effectiveness and efficiency of our GRADE framework on both cross-network node classification and cross-domain recommendation tasks.
5. You, Jiaxuan, Jonathan M. Gomes-Selman, Rex Ying, and Jure Leskovec. "Identity-aware Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10737–45. http://dx.doi.org/10.1609/aaai.v35i12.17283.

Abstract: Message passing graph neural networks (GNNs) provide a powerful modeling framework for relational data. However, the expressive power of existing GNNs is upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test, which means such GNNs are unable to predict node clustering coefficients and shortest path distances, and cannot differentiate between different d-regular graphs. Here we develop a class of message passing GNNs, named Identity-aware Graph Neural Networks (ID-GNNs), with greater expressive power than the 1-WL test. ID-GNN offers a minimal but powerful solution to the limitations of existing GNNs. ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing. To embed a given node, ID-GNN first extracts the ego network centered at the node, then conducts rounds of heterogeneous message passing, where different sets of parameters are applied to the center node than to other surrounding nodes in the ego network. We further propose a simplified but faster version of ID-GNN that injects node identity information as augmented node features. Altogether, both versions of ID-GNN represent general extensions of message passing GNNs, where experiments show that transforming existing GNNs to ID-GNNs yields on average a 40% accuracy improvement on challenging node, edge, and graph property prediction tasks; a 3% accuracy improvement on node and graph classification benchmarks; and a 15% ROC AUC improvement on real-world link prediction tasks. Additionally, ID-GNNs demonstrate improved or comparable performance over other task-specific graph networks.
6. Grohe, Martin, and Martin Otto. "Pebble Games and Linear Equations." Journal of Symbolic Logic 80, no. 3 (July 22, 2015): 797–844. http://dx.doi.org/10.1017/jsl.2015.28.

Abstract: We give a new, simplified and detailed account of the correspondence between levels of the Sherali–Adams relaxation of graph isomorphism and levels of pebble-game equivalence with counting (higher-dimensional Weisfeiler–Lehman colour refinement). The correspondence between basic colour refinement and fractional isomorphism, due to Tinhofer [22; 23] and Ramana, Scheinerman and Ullman [17], is re-interpreted as the base level of Sherali–Adams and generalised to higher levels in this sense by Atserias and Maneva [1] and Malkin [14], who prove that the two resulting hierarchies interleave. In carrying this analysis further, we here give (a) a precise characterisation of the level-k Sherali–Adams relaxation in terms of a modified counting pebble game; (b) a variant of the Sherali–Adams levels that precisely matches the k-pebble counting game; (c) a proof that the interleaving between these two hierarchies is strict. We also investigate the variation based on boolean arithmetic instead of real/rational arithmetic and obtain analogous correspondences and separations for plain k-pebble equivalence (without counting). Our results are driven by considerably simplified accounts of the underlying combinatorics and linear algebra.
7. Evdokimov, Sergei, and Ilia Ponomarenko. "On Highly Closed Cellular Algebras and Highly Closed Isomorphisms." Electronic Journal of Combinatorics 6, no. 1 (November 6, 1998). http://dx.doi.org/10.37236/1450.

Abstract: We define and study $m$-closed cellular algebras (coherent configurations) and $m$-isomorphisms of cellular algebras which can be regarded as $m$th approximations of Schurian algebras (i.e. the centralizer algebras of permutation groups) and of strong isomorphisms (i.e. bijections of the point sets taking one algebra to the other) respectively. If $m=1$ we come to arbitrary cellular algebras and their weak isomorphisms (i.e. matrix algebra isomorphisms preserving the Hadamard multiplication). On the other hand, the algebras which are $m$-closed for all $m\ge 1$ are exactly Schurian ones, whereas the weak isomorphisms which are $m$-isomorphisms for all $m\ge 1$ are exactly ones induced by strong isomorphisms. We show that for any $m$ there exist $m$-closed algebras on $O(m)$ points which are not Schurian and $m$-isomorphisms of cellular algebras on $O(m)$ points which are not induced by strong isomorphisms. This enables us to find for any $m$ an edge colored graph with $O(m)$ vertices satisfying the $m$-vertex condition and having non-Schurian adjacency algebra. On the other hand, we rediscover and explain from the algebraic point of view the Cai-Fürer-Immerman phenomenon that the $m$-dimensional Weisfeiler-Lehman method fails to recognize the isomorphism of graphs in an efficient way.

Dissertations / Theses on the topic "Weisfeiler-Lehman graph isomorphism"

1. Doshi, Siddhanth Rahul. "Graph Neural Networks with Parallel Local Neighborhood Aggregations." Thesis, 2022. https://etd.iisc.ac.in/handle/2005/5762.

Abstract: Graph neural networks (GNNs) have become very popular for processing and analyzing graph-structured data in the last few years. Using message passing as their basic building blocks that aggregate information from neighborhoods, GNN architectures learn low-dimensional graph-level or node-level embeddings useful for several downstream machine learning tasks. In this thesis, we focus on GNN architectures that perform parallel neighborhood aggregations (in short, referred to as PA-GNNs) for two tasks, namely, graph classification and link prediction. Such architectures have a natural advantage of reduced training and inference time, as neighborhood aggregation is done before training, unlike GNNs that perform the neighborhood aggregation sequentially (referred to as SA-GNNs) during training. Thus, the runtime of SA-GNNs depends on the number of edges in the graph. In the first part of the thesis, we propose a generic model for GNNs with parallel neighborhood aggregation and theoretically characterize the discriminative power of PA-GNN models to discriminate between two non-isomorphic graphs. We provide conditions under which they are provably as powerful as the well-known Weisfeiler-Lehman graph isomorphism test. We then specialize the generic PA-GNN model and propose a GNN architecture, which we call SPIN: simple and parallel graph isomorphism network. Alongside the theoretical characterization of the developed PA-GNN model, we also present numerical experiments on a diverse set of real-world benchmark graph classification datasets related to social networks, chemical molecular data, and brain networks. The proposed model achieves state-of-the-art performance with reduced training and inference time.
In the second part of the thesis, we propose a computational method for drug repurposing by presenting a deep learning model that captures the complex interactions between the drugs, diseases, genes, and anatomies in a large-scale interactome with over 1.4 million connections. Specifically, we propose a PA-GNN based drug repurposing architecture, which we call GDRnet, to screen a large database of clinically approved drugs and predict potential drugs that can be repurposed for novel diseases. GNN-based machine learning models are a natural choice for computational drug repurposing because of their ability to capture the underlying structural information in such complex biological networks. While the proposed PA-GNN architecture is computationally attractive, we present results from numerical experiments to show the efficacy of GNNs for drug repurposing. We also provide numerical experimental results on drug repurposing for coronavirus diseases (including COVID-19), where many of the drugs predicted by the proposed model are considered mainstay treatments.

Book chapters on the topic "Weisfeiler-Lehman graph isomorphism"

1. Da San Martino, Giovanni, Nicolò Navarin, and Alessandro Sperduti. "Graph Kernels Exploiting Weisfeiler-Lehman Graph Isomorphism Test Extensions." In Neural Information Processing, 93–100. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-12640-1_12.