Scientific literature on the topic "Weisfeiler-Lehman graph isomorphism"

Create a correct reference in APA, MLA, Chicago, Harvard, and several other styles

Choose a source:

Consult the thematic lists of journal articles, books, theses, conference reports, and other academic sources on the topic "Weisfeiler-Lehman graph isomorphism".

Next to each source in the reference list there is an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference for the chosen source in your preferred citation style: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the scholarly publication as a PDF and read its abstract online when this information is included in the metadata.

Journal articles on the topic "Weisfeiler-Lehman graph isomorphism"

1

Schulz, Till, Pascal Welke, and Stefan Wrobel. "Graph Filtration Kernels." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 8 (June 28, 2022): 8196–203. http://dx.doi.org/10.1609/aaai.v36i8.20793.

Full text
Abstract:
Most popular graph kernels are based on the concept of Haussler's R-convolution kernel and define graph similarities in terms of mutual substructures. In this work, we enrich these similarity measures by considering graph filtrations: using meaningful orders on the set of edges, which allow us to construct a sequence of nested graphs, we can consider a graph at multiple granularities. A key concept of our approach is to track graph features over the course of such graph resolutions. Rather than simply comparing feature frequencies in graphs, this allows for their comparison in terms of when and for how long they exist in the sequences. In this work, we propose a family of graph kernels that incorporate these existence intervals of features. While our approach can be applied to arbitrary graph features, we particularly highlight Weisfeiler-Lehman vertex labels, leading to efficient kernels. We show that using Weisfeiler-Lehman labels over certain filtrations strictly increases the expressive power over the ordinary Weisfeiler-Lehman procedure in terms of deciding graph isomorphism. In fact, this result directly yields more powerful graph kernels based on such features and has implications for graph neural networks due to their close relationship to the Weisfeiler-Lehman method. We empirically validate the expressive power of our graph kernels and show significant improvements over state-of-the-art graph kernels in terms of predictive performance on various real-world benchmark datasets.
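The Weisfeiler-Lehman vertex labels used here come from colour refinement: each vertex repeatedly replaces its label with a compressed signature of its own label and the multiset of its neighbours' labels, and the resulting label histogram is what substructure-counting kernels compare. A minimal Python sketch, assuming graphs are given as adjacency dicts (the function name wl_histogram and the fixed round count are our own choices, not the paper's):

    from collections import Counter

    def wl_histogram(adj, rounds=3):
        # adj maps each vertex to an iterable of its neighbours.
        labels = {v: 0 for v in adj}  # start from a uniform labelling
        for _ in range(rounds):
            # Signature = own label plus the sorted multiset of neighbour labels.
            sigs = {v: (labels[v], tuple(sorted(labels[u] for u in adj[v]))) for v in adj}
            # Compress each distinct signature to a small integer.
            # (To compare many graphs, this signature-to-integer map should be shared.)
            palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
            labels = {v: palette[sigs[v]] for v in adj}
        return Counter(labels.values())

Different histograms certify non-isomorphism, but equal histograms prove nothing: the 6-cycle and the disjoint union of two triangles are both 2-regular, so colour refinement never separates their vertices and wl_histogram returns the same answer for both. Limitations of this kind are what the filtration-based kernels above are designed to mitigate.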
2

Yang, Sihong, Dezhi Jin, Jun Liu, and Ye He. "Identification of Young High-Functioning Autism Individuals Based on Functional Connectome Using Graph Isomorphism Network: A Pilot Study." Brain Sciences 12, no. 7 (July 5, 2022): 883. http://dx.doi.org/10.3390/brainsci12070883.

Full text
Abstract:
Accumulating studies have documented changes in functional connectivity in autism spectrum disorder (ASD) and spurred the application of machine learning to ASD classification. Graph neural networks provide a new method for network analysis in brain disorders to identify the underlying network features associated with functional deficits. Here, we proposed an improved model of Graph Isomorphism Network (GIN) that implements the Weisfeiler-Lehman (WL) graph isomorphism test to learn the graph features while taking into account the importance of each node in the classification to improve the interpretability of the algorithm. We applied the proposed method on multisite datasets of resting-state functional connectome from Autism Brain Imaging Data Exchange (ABIDE) after stringent quality control. The proposed method outperformed other commonly used classification methods on five different evaluation metrics. We also identified salient ROIs in visual and frontoparietal control networks, which could provide potential neuroimaging biomarkers for ASD identification.
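For context, the Graph Isomorphism Network layer this model builds on (Xu et al.) aggregates neighbour features with a sum and feeds the result through an MLP, chosen so that the aggregation is injective on multisets and hence as discriminative as the 1-WL test; in LaTeX notation:

    h_v^{(k)} \;=\; \mathrm{MLP}^{(k)}\!\Bigl( \bigl(1 + \epsilon^{(k)}\bigr)\, h_v^{(k-1)} \;+\; \sum_{u \in \mathcal{N}(v)} h_u^{(k-1)} \Bigr)

The improved model described above adds node-importance information on top of such an update to aid interpretability.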
3

Feng, Aosong, Chenyu You, Shiqiang Wang, and Leandros Tassiulas. "KerGNNs: Interpretable Graph Neural Networks with Graph Kernels." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 6 (June 28, 2022): 6614–22. http://dx.doi.org/10.1609/aaai.v36i6.20615.

Full text
Abstract:
Graph kernels are historically the most widely used technique for graph classification tasks. However, these methods suffer from limited performance because of their hand-crafted combinatorial graph features. In recent years, graph neural networks (GNNs) have become the state-of-the-art method for downstream graph-related tasks due to their superior performance. Most GNNs are based on the Message Passing Neural Network (MPNN) framework. However, recent studies show that MPNNs cannot exceed the power of the Weisfeiler-Lehman (WL) algorithm in the graph isomorphism test. To address the limitations of existing graph kernel and GNN methods, we propose in this paper a novel GNN framework, termed Kernel Graph Neural Networks (KerGNNs), which integrates graph kernels into the message passing process of GNNs. Inspired by convolution filters in convolutional neural networks (CNNs), KerGNNs adopt trainable hidden graphs as graph filters, which are combined with subgraphs to update node embeddings using graph kernels. In addition, we show that MPNNs can be viewed as special cases of KerGNNs. We apply KerGNNs to multiple graph-related tasks and use cross-validation to make fair comparisons with benchmarks. We show that our method achieves competitive performance compared with existing state-of-the-art methods, demonstrating the potential to increase the representation ability of GNNs. We also show that the trained graph filters in KerGNNs can reveal the local graph structures of the dataset, which significantly improves model interpretability compared with conventional GNN models.
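The message passing framework referred to here can be written as one message step and one update step per round (Gilmer et al.'s MPNN formulation); the WL upper bound discussed above applies to any architecture of this shape:

    m_v^{(t+1)} = \sum_{u \in \mathcal{N}(v)} M_t\bigl(h_v^{(t)}, h_u^{(t)}, e_{uv}\bigr), \qquad h_v^{(t+1)} = U_t\bigl(h_v^{(t)}, m_v^{(t+1)}\bigr)

where M_t is the message function, U_t the node update function, and e_{uv} the edge features. KerGNNs, as described above, instead update node embeddings by comparing each node's subgraph with trainable hidden graphs via a graph kernel.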
4

Wu, Jun, Jingrui He, and Elizabeth Ainsworth. "Non-IID Transfer Learning on Graphs." Proceedings of the AAAI Conference on Artificial Intelligence 37, no. 9 (June 26, 2023): 10342–50. http://dx.doi.org/10.1609/aaai.v37i9.26231.

Full text
Abstract:
Transfer learning refers to the transfer of knowledge or information from a relevant source domain to a target domain. However, most existing transfer learning theories and algorithms focus on IID tasks, where the source/target samples are assumed to be independent and identically distributed. Very little effort is devoted to theoretically studying knowledge transferability on non-IID tasks, e.g., cross-network mining. To bridge the gap, in this paper, we propose rigorous generalization bounds and algorithms for cross-network transfer learning from a source graph to a target graph. The crucial idea is to characterize the cross-network knowledge transferability from the perspective of the Weisfeiler-Lehman graph isomorphism test. To this end, we propose a novel Graph Subtree Discrepancy to measure the graph distribution shift between source and target graphs. Then the generalization error bounds on cross-network transfer learning, including both cross-network node classification and link prediction tasks, can be derived in terms of the source knowledge and the Graph Subtree Discrepancy across domains. This motivates us to propose a generic graph adaptive network (GRADE) to minimize the distribution shift between source and target graphs for cross-network transfer learning. Experimental results verify the effectiveness and efficiency of our GRADE framework on both cross-network node classification and cross-domain recommendation tasks.
5

You, Jiaxuan, Jonathan M. Gomes-Selman, Rex Ying, and Jure Leskovec. "Identity-aware Graph Neural Networks." Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 12 (May 18, 2021): 10737–45. http://dx.doi.org/10.1609/aaai.v35i12.17283.

Full text
Abstract:
Message passing Graph Neural Networks (GNNs) provide a powerful modeling framework for relational data. However, the expressive power of existing GNNs is upper-bounded by the 1-Weisfeiler-Lehman (1-WL) graph isomorphism test, which means GNNs are unable to predict node clustering coefficients and shortest path distances, and cannot differentiate between different d-regular graphs. Here we develop a class of message passing GNNs, named Identity-aware Graph Neural Networks (ID-GNNs), with greater expressive power than the 1-WL test. ID-GNN offers a minimal but powerful solution to the limitations of existing GNNs. ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing. To embed a given node, ID-GNN first extracts the ego network centered at the node, then conducts rounds of heterogeneous message passing, where different sets of parameters are applied to the center node than to the other surrounding nodes in the ego network. We further propose a simplified but faster version of ID-GNN that injects node identity information as augmented node features. Altogether, both versions of ID-GNN represent general extensions of message passing GNNs, where experiments show that transforming existing GNNs to ID-GNNs yields on average a 40% accuracy improvement on challenging node, edge, and graph property prediction tasks; a 3% accuracy improvement on node and graph classification benchmarks; and a 15% ROC AUC improvement on real-world link prediction tasks. Additionally, ID-GNNs demonstrate improved or comparable performance over other task-specific graph networks.
6

Grohe, Martin, and Martin Otto. "Pebble Games and Linear Equations." Journal of Symbolic Logic 80, no. 3 (July 22, 2015): 797–844. http://dx.doi.org/10.1017/jsl.2015.28.

Full text
Abstract:
We give a new, simplified and detailed account of the correspondence between levels of the Sherali–Adams relaxation of graph isomorphism and levels of pebble-game equivalence with counting (higher-dimensional Weisfeiler–Lehman colour refinement). The correspondence between basic colour refinement and fractional isomorphism, due to Tinhofer [22; 23] and Ramana, Scheinerman and Ullman [17], is re-interpreted as the base level of Sherali–Adams and generalised to higher levels in this sense by Atserias and Maneva [1] and Malkin [14], who prove that the two resulting hierarchies interleave. In carrying this analysis further, we here give (a) a precise characterisation of the level k Sherali–Adams relaxation in terms of a modified counting pebble game; (b) a variant of the Sherali–Adams levels that precisely match the k-pebble counting game; (c) a proof that the interleaving between these two hierarchies is strict. We also investigate the variation based on boolean arithmetic instead of real/rational arithmetic and obtain analogous correspondences and separations for plain k-pebble equivalence (without counting). Our results are driven by considerably simplified accounts of the underlying combinatorics and linear algebra.
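The base-level correspondence credited here to Tinhofer and to Ramana, Scheinerman and Ullman can be stated compactly: graphs G and H with adjacency matrices A and B are indistinguishable by colour refinement exactly when they are fractionally isomorphic, i.e. when a doubly stochastic matrix intertwines A and B (a standard formulation, not quoted from the paper):

    \exists X \ge 0 : \quad AX = XB, \qquad X\mathbf{1} = \mathbf{1}, \qquad \mathbf{1}^{\top} X = \mathbf{1}^{\top}

Restricting X to 0-1 permutation matrices recovers exact isomorphism; the Sherali–Adams levels analysed in the paper strengthen the relaxation step by step between these two extremes.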
7

Evdokimov, Sergei, and Ilia Ponomarenko. "On Highly Closed Cellular Algebras and Highly Closed Isomorphisms." Electronic Journal of Combinatorics 6, no. 1 (November 6, 1998). http://dx.doi.org/10.37236/1450.

Full text
Abstract:
We define and study $m$-closed cellular algebras (coherent configurations) and $m$-isomorphisms of cellular algebras which can be regarded as $m$th approximations of Schurian algebras (i.e. the centralizer algebras of permutation groups) and of strong isomorphisms (i.e. bijections of the point sets taking one algebra to the other) respectively. If $m=1$ we come to arbitrary cellular algebras and their weak isomorphisms (i.e. matrix algebra isomorphisms preserving the Hadamard multiplication). On the other hand, the algebras which are $m$-closed for all $m\ge 1$ are exactly Schurian ones whereas the weak isomorphisms which are $m$-isomorphisms for all $m\ge 1$ are exactly ones induced by strong isomorphisms. We show that for any $m$ there exist $m$-closed algebras on $O(m)$ points which are not Schurian and $m$-isomorphisms of cellular algebras on $O(m)$ points which are not induced by strong isomorphisms. This enables us to find for any $m$ an edge colored graph with $O(m)$ vertices satisfying the $m$-vertex condition and having non-Schurian adjacency algebra. On the other hand, we rediscover and explain from the algebraic point of view the Cai-Fürer-Immerman phenomenon that the $m$-dimensional Weisfeiler-Lehman method fails to recognize the isomorphism of graphs in an efficient way.

Theses on the topic "Weisfeiler-Lehman graph isomorphism"

1

Doshi, Siddhanth Rahul. "Graph Neural Networks with Parallel Local Neighborhood Aggregations." Thesis, 2022. https://etd.iisc.ac.in/handle/2005/5762.

Full text
Abstract:
Graph neural networks (GNNs) have become very popular for processing and analyzing graph-structured data in the last few years. Using message passing as their basic building block to aggregate information from neighborhoods, GNN architectures learn low-dimensional graph-level or node-level embeddings useful for several downstream machine learning tasks. In this thesis, we focus on GNN architectures that perform parallel neighborhood aggregations (referred to as PA-GNNs for short) for two tasks, namely graph classification and link prediction. Such architectures have the natural advantage of reduced training and inference time, as neighborhood aggregation is done before training, unlike GNNs that perform neighborhood aggregation sequentially during training (referred to as SA-GNNs). Thus, the runtime of SA-GNNs depends on the number of edges in the graph. In the first part of the thesis, we propose a generic model for GNNs with parallel neighborhood aggregation and theoretically characterize the discriminative power of PA-GNN models in distinguishing two non-isomorphic graphs. We provide conditions under which they are provably as powerful as the well-known Weisfeiler-Lehman graph isomorphism test. We then specialize the generic PA-GNN model and propose a GNN architecture, which we call SPIN: simple and parallel graph isomorphism network. In addition to the theoretical characterization of the developed PA-GNN model, we also present numerical experiments on a diverse set of real-world benchmark graph classification datasets related to social networks, chemical molecular data, and brain networks. The proposed model achieves state-of-the-art performance with reduced training and inference time. In the second part of the thesis, we propose a computational method for drug repurposing by presenting a deep learning model that captures the complex interactions between drugs, diseases, genes, and anatomies in a large-scale interactome with over 1.4 million connections. Specifically, we propose a PA-GNN-based drug repurposing architecture, which we call GDRnet, to screen a large database of clinically approved drugs and predict potential drugs that can be repurposed for novel diseases. GNN-based machine learning models are a natural choice for computational drug repurposing because of their ability to capture the underlying structural information in such complex biological networks. While the proposed PA-GNN architecture is computationally attractive, we present results from numerical experiments to show the efficacy of GNNs for drug repurposing. We also provide numerical experimental results on drug repurposing for coronavirus diseases (including COVID-19), where many of the drugs predicted by the proposed model are considered mainstay treatments.
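The runtime argument above rests on computing all neighborhood aggregates once, up front. A minimal sketch of that parallel-aggregation idea in Python (an illustration in our own notation, not the SPIN or GDRnet architecture itself):

    import numpy as np

    def precompute_hops(A, X, num_hops=3):
        # A: (n, n) adjacency matrix, X: (n, d) node features, both NumPy arrays.
        # Stack [X, SX, S^2 X, ...], where S is the symmetrically normalised
        # adjacency matrix; computed once, before any training starts.
        deg = A.sum(axis=1)
        d_inv_sqrt = np.zeros_like(deg, dtype=float)
        nz = deg > 0
        d_inv_sqrt[nz] = deg[nz] ** -0.5
        S = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
        hops, H = [X], X
        for _ in range(num_hops):
            H = S @ H                         # one further hop of aggregation
            hops.append(H)
        return np.concatenate(hops, axis=1)   # fixed input to a downstream classifier

Because these aggregates never change during training, the per-epoch cost no longer depends on the number of edges, which is the advantage over sequential-aggregation GNNs described in the thesis.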

Book chapters on the topic "Weisfeiler-Lehman graph isomorphism"

1

Da San Martino, Giovanni, Nicolò Navarin, and Alessandro Sperduti. "Graph Kernels Exploiting Weisfeiler-Lehman Graph Isomorphism Test Extensions." In Neural Information Processing, 93–100. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-12640-1_12.

Full text
